[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 11792 1727096117.13855: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-And executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 11792 1727096117.14323: Added group all to inventory 11792 1727096117.14325: Added group ungrouped to inventory 11792 1727096117.14329: Group all now contains ungrouped 11792 1727096117.14332: Examining possible inventory source: /tmp/network-EuO/inventory.yml 11792 1727096117.40911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 11792 1727096117.40972: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 11792 1727096117.40995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 11792 1727096117.41053: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 11792 1727096117.41535: Loaded config def from plugin (inventory/script) 11792 1727096117.41538: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 11792 1727096117.41581: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 11792 1727096117.42080: Loaded config def from plugin (inventory/yaml) 11792 1727096117.42082: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 11792 1727096117.42175: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 11792 1727096117.43429: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 11792 1727096117.43433: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 11792 1727096117.43436: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 11792 1727096117.43443: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 11792 1727096117.43448: Loading data from /tmp/network-EuO/inventory.yml 11792 1727096117.43922: /tmp/network-EuO/inventory.yml was not parsable by auto 11792 1727096117.43988: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 11792 1727096117.44030: Loading data from /tmp/network-EuO/inventory.yml 11792 1727096117.44523: group all already in inventory 11792 1727096117.44530: set inventory_file for managed_node1 11792 1727096117.44534: set inventory_dir for managed_node1 11792 1727096117.44535: Added host managed_node1 to inventory 11792 1727096117.44538: Added host managed_node1 to group all 11792 1727096117.44539: set ansible_host for managed_node1 11792 1727096117.44539: 
set ansible_ssh_extra_args for managed_node1 11792 1727096117.44542: set inventory_file for managed_node2 11792 1727096117.44545: set inventory_dir for managed_node2 11792 1727096117.44546: Added host managed_node2 to inventory 11792 1727096117.44547: Added host managed_node2 to group all 11792 1727096117.44548: set ansible_host for managed_node2 11792 1727096117.44549: set ansible_ssh_extra_args for managed_node2 11792 1727096117.44551: set inventory_file for managed_node3 11792 1727096117.44553: set inventory_dir for managed_node3 11792 1727096117.44554: Added host managed_node3 to inventory 11792 1727096117.44555: Added host managed_node3 to group all 11792 1727096117.44556: set ansible_host for managed_node3 11792 1727096117.44557: set ansible_ssh_extra_args for managed_node3 11792 1727096117.44560: Reconcile groups and hosts in inventory. 11792 1727096117.44563: Group ungrouped now contains managed_node1 11792 1727096117.44565: Group ungrouped now contains managed_node2 11792 1727096117.44569: Group ungrouped now contains managed_node3 11792 1727096117.44648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11792 1727096117.45145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11792 1727096117.45195: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11792 1727096117.45226: Loaded config def from plugin (vars/host_group_vars) 11792 1727096117.45228: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11792 1727096117.45236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11792 1727096117.45244: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11792 1727096117.45553: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11792 1727096117.46336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096117.46451: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11792 1727096117.46610: Loaded config def from plugin (connection/local) 11792 1727096117.46613: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11792 1727096117.48060: Loaded config def from plugin (connection/paramiko_ssh) 11792 1727096117.48065: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11792 1727096117.50027: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11792 1727096117.50185: Loaded config def from plugin (connection/psrp) 11792 1727096117.50189: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11792 1727096117.51651: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11792 1727096117.51885: Loaded config def from plugin (connection/ssh) 11792 1727096117.51889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11792 1727096117.55994: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11792 1727096117.56100: Loaded config def from plugin (connection/winrm) 11792 1727096117.56104: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11792 1727096117.56253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11792 1727096117.56425: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11792 1727096117.56578: Loaded config def from plugin (shell/cmd) 11792 1727096117.56580: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11792 1727096117.56607: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11792 1727096117.56785: Loaded config def from plugin (shell/powershell) 11792 1727096117.56787: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11792 1727096117.56842: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11792 1727096117.57275: Loaded config def from plugin (shell/sh) 11792 1727096117.57278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11792 1727096117.57429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11792 1727096117.57557: Loaded config def from plugin (become/runas) 11792 1727096117.57560: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11792 1727096117.58113: Loaded config def from plugin (become/su) 11792 1727096117.58115: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11792 1727096117.58484: Loaded config def from plugin (become/sudo) 11792 1727096117.58487: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11792 1727096117.58526: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml 11792 1727096117.59281: in VariableManager get_vars() 11792 1727096117.59303: done with get_vars() 11792 1727096117.59660: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11792 1727096117.66643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11792 1727096117.66819: in VariableManager get_vars() 
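The inventory pass earlier in this run (the yaml plugin parsing /tmp/network-EuO/inventory.yml) registers managed_node1, managed_node2 and managed_node3 in the implicit all/ungrouped groups and sets ansible_host and ansible_ssh_extra_args for each of them. A minimal inventory consistent with that output could look like the sketch below; the node addresses and SSH options are illustrative placeholders, not values recovered from this log.

# Hypothetical inventory.yml matching the parse described above.
# Addresses and SSH option strings are placeholders, not taken from this run.
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.11                              # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder option string
    managed_node2:
      ansible_host: 192.0.2.12
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no
    managed_node3:
      ansible_host: 192.0.2.13
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no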
11792 1727096117.66825: done with get_vars() 11792 1727096117.66828: variable 'playbook_dir' from source: magic vars 11792 1727096117.66829: variable 'ansible_playbook_python' from source: magic vars 11792 1727096117.66830: variable 'ansible_config_file' from source: magic vars 11792 1727096117.66830: variable 'groups' from source: magic vars 11792 1727096117.66831: variable 'omit' from source: magic vars 11792 1727096117.66832: variable 'ansible_version' from source: magic vars 11792 1727096117.66833: variable 'ansible_check_mode' from source: magic vars 11792 1727096117.66833: variable 'ansible_diff_mode' from source: magic vars 11792 1727096117.66834: variable 'ansible_forks' from source: magic vars 11792 1727096117.66835: variable 'ansible_inventory_sources' from source: magic vars 11792 1727096117.66835: variable 'ansible_skip_tags' from source: magic vars 11792 1727096117.66836: variable 'ansible_limit' from source: magic vars 11792 1727096117.66837: variable 'ansible_run_tags' from source: magic vars 11792 1727096117.66838: variable 'ansible_verbosity' from source: magic vars 11792 1727096117.66874: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml 11792 1727096117.67507: in VariableManager get_vars() 11792 1727096117.67526: done with get_vars() 11792 1727096117.67728: in VariableManager get_vars() 11792 1727096117.67742: done with get_vars() 11792 1727096117.68031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11792 1727096117.68046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11792 1727096117.68640: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11792 1727096117.69015: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11792 1727096117.69018: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 11792 1727096117.69095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11792 1727096117.69121: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11792 1727096117.69581: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11792 1727096117.69758: Loaded config def from plugin (callback/default) 11792 1727096117.69761: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11792 1727096117.71486: Loaded config def from plugin (callback/junit) 11792 1727096117.71490: Loading CallbackModule 'junit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11792 1727096117.71537: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11792 1727096117.71721: Loaded config def from plugin (callback/minimal) 11792 1727096117.71724: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11792 1727096117.71764: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11792 1727096117.71929: Loaded config def from plugin (callback/tree) 11792 1727096117.71932: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 11792 1727096117.72244: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 11792 1727096117.72247: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_bond_options_nm.yml ********************************************
2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml
11792 1727096117.72320: in VariableManager get_vars()
11792 1727096117.72336: done with get_vars()
11792 1727096117.72342: in VariableManager get_vars()
11792 1727096117.72350: done with get_vars()
11792 1727096117.72355: variable 'omit' from source: magic vars
11792 1727096117.72396: in VariableManager get_vars()
11792 1727096117.72576: done with get_vars()
11792 1727096117.72599: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_options.yml' with nm as provider] *****
11792 1727096117.73484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11792 1727096117.75584: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11792 1727096117.75615: getting the remaining hosts for this loop
11792 1727096117.75617: done getting the remaining hosts for this loop
11792 1727096117.75620: getting the next task for host managed_node2
11792 1727096117.75624: done getting next task for host managed_node2
11792 1727096117.75626: ^ task is: TASK: Gathering Facts
11792 1727096117.75628: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11792 1727096117.75630: getting variables
11792 1727096117.75631: in VariableManager get_vars()
11792 1727096117.75642: Calling all_inventory to load vars for managed_node2
11792 1727096117.75644: Calling groups_inventory to load vars for managed_node2
11792 1727096117.75647: Calling all_plugins_inventory to load vars for managed_node2
11792 1727096117.75660: Calling all_plugins_play to load vars for managed_node2
11792 1727096117.75674: Calling groups_plugins_inventory to load vars for managed_node2
11792 1727096117.75677: Calling groups_plugins_play to load vars for managed_node2
11792 1727096117.75715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11792 1727096117.75769: done with get_vars()
11792 1727096117.75777: done getting variables
11792 1727096117.75853: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
Monday 23 September 2024 08:55:17 -0400 (0:00:00.037) 0:00:00.037 ******
11792 1727096117.75880: entering _queue_task() for managed_node2/gather_facts
11792 1727096117.75881: Creating lock for gather_facts
11792 1727096117.76240: worker is 1 (out of 1 available)
11792 1727096117.76252: exiting _queue_task() for managed_node2/gather_facts
11792 1727096117.76267: done queuing things up, now waiting for results queue to drain
11792 1727096117.76473: waiting for pending results...
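The PLAYBOOK and PLAY banners above announce a two-play file whose first play re-runs playbooks/tests_bond_options.yml with NetworkManager ('nm') as the provider, with the implicit Gathering Facts task attributed to line 6 of tests_bond_options_nm.yml. A plausible shape for such a wrapper, reconstructed from the banners alone rather than copied from the actual file, is sketched below; the network_provider variable name is an assumption about how the provider is handed to the shared test playbook.

# Hypothetical sketch of a two-play wrapper such as tests_bond_options_nm.yml.
# Reconstructed from the banners above, not copied from the real file;
# the network_provider variable name is an assumption.
- name: Run playbook 'playbooks/tests_bond_options.yml' with nm as provider
  hosts: all
  tasks:
    - name: Force the provider used by the shared bond options tests
      ansible.builtin.set_fact:
        network_provider: nm

- name: Import the shared bond options test playbook
  ansible.builtin.import_playbook: playbooks/tests_bond_options.yml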
11792 1727096117.76687: running TaskExecutor() for managed_node2/TASK: Gathering Facts 11792 1727096117.76693: in run() - task 0afff68d-5257-d9c7-3fc0-000000000015 11792 1727096117.76696: variable 'ansible_search_path' from source: unknown 11792 1727096117.76699: calling self._execute() 11792 1727096117.76738: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096117.76762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096117.76778: variable 'omit' from source: magic vars 11792 1727096117.76886: variable 'omit' from source: magic vars 11792 1727096117.76919: variable 'omit' from source: magic vars 11792 1727096117.76960: variable 'omit' from source: magic vars 11792 1727096117.77011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096117.77057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096117.77082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096117.77103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096117.77117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096117.77151: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096117.77160: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096117.77170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096117.77278: Set connection var ansible_timeout to 10 11792 1727096117.77359: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096117.77362: Set connection var ansible_shell_executable to /bin/sh 11792 1727096117.77364: Set connection var ansible_pipelining to False 11792 1727096117.77366: Set connection var ansible_shell_type to sh 11792 1727096117.77370: Set connection var ansible_connection to ssh 11792 1727096117.77372: variable 'ansible_shell_executable' from source: unknown 11792 1727096117.77374: variable 'ansible_connection' from source: unknown 11792 1727096117.77376: variable 'ansible_module_compression' from source: unknown 11792 1727096117.77378: variable 'ansible_shell_type' from source: unknown 11792 1727096117.77380: variable 'ansible_shell_executable' from source: unknown 11792 1727096117.77382: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096117.77384: variable 'ansible_pipelining' from source: unknown 11792 1727096117.77386: variable 'ansible_timeout' from source: unknown 11792 1727096117.77388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096117.77597: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 11792 1727096117.77611: variable 'omit' from source: magic vars 11792 1727096117.77620: starting attempt loop 11792 1727096117.77625: running the handler 11792 1727096117.77643: variable 'ansible_facts' from source: unknown 11792 1727096117.77664: _low_level_execute_command(): starting 11792 1727096117.77681: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096117.78439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 
1727096117.78456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096117.78514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096117.78531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096117.78564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096117.78629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096117.80418: stdout chunk (state=3): >>>/root <<< 11792 1727096117.80628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096117.80632: stdout chunk (state=3): >>><<< 11792 1727096117.80636: stderr chunk (state=3): >>><<< 11792 1727096117.80825: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096117.80829: _low_level_execute_command(): starting 11792 1727096117.80833: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589 `" && echo ansible-tmp-1727096117.807103-11825-119760924985589="` echo /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589 `" ) && sleep 0' 11792 1727096117.81653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096117.81886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096117.81909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096117.83965: stdout chunk (state=3): >>>ansible-tmp-1727096117.807103-11825-119760924985589=/root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589 <<< 11792 1727096117.84135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096117.84139: stdout chunk (state=3): >>><<< 11792 1727096117.84142: stderr chunk (state=3): >>><<< 11792 1727096117.84295: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096117.807103-11825-119760924985589=/root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096117.84299: variable 'ansible_module_compression' from source: unknown 11792 1727096117.84386: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11792 1727096117.84394: ANSIBALLZ: Acquiring lock 11792 1727096117.84401: ANSIBALLZ: Lock acquired: 139635227775856 11792 1727096117.84409: ANSIBALLZ: Creating module 11792 1727096118.38130: ANSIBALLZ: Writing module into payload 11792 1727096118.38344: ANSIBALLZ: Writing module 11792 1727096118.38538: ANSIBALLZ: Renaming module 11792 1727096118.38541: ANSIBALLZ: Done creating module 11792 1727096118.38544: variable 'ansible_facts' from source: unknown 11792 1727096118.38546: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 
1727096118.38551: _low_level_execute_command(): starting 11792 1727096118.38554: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11792 1727096118.39997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096118.40086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096118.40319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096118.40334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096118.40418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096118.42130: stdout chunk (state=3): >>>PLATFORM <<< 11792 1727096118.42198: stdout chunk (state=3): >>>Linux <<< 11792 1727096118.42278: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11792 1727096118.42553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096118.42556: stdout chunk (state=3): >>><<< 11792 1727096118.42558: stderr chunk (state=3): >>><<< 11792 1727096118.42576: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096118.42598 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11792 1727096118.42655: _low_level_execute_command(): starting 11792 1727096118.42946: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11792 1727096118.43191: Sending initial data 11792 1727096118.43194: Sent initial data (1181 bytes) 11792 1727096118.44053: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096118.44076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096118.44237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096118.44403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096118.44407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096118.47933: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11792 1727096118.48501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096118.48562: stderr chunk (state=3): >>><<< 11792 1727096118.48573: stdout chunk (state=3): >>><<< 11792 1727096118.48708: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096118.48924: variable 'ansible_facts' from source: unknown 11792 1727096118.48928: variable 'ansible_facts' from source: unknown 11792 1727096118.48930: variable 'ansible_module_compression' from source: unknown 11792 1727096118.48943: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11792 1727096118.49014: variable 'ansible_facts' from source: unknown 11792 1727096118.49408: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/AnsiballZ_setup.py 11792 1727096118.49812: Sending initial data 11792 1727096118.49944: Sent initial data (153 bytes) 11792 1727096118.51570: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096118.51590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096118.51745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096118.51764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096118.51869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096118.51888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096118.51891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096118.52297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096118.54008: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096118.54035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096118.54081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp9jcgygo3 /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/AnsiballZ_setup.py <<< 11792 1727096118.54085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/AnsiballZ_setup.py" <<< 11792 1727096118.54118: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp9jcgygo3" to remote "/root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/AnsiballZ_setup.py" <<< 11792 1727096118.58166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096118.58178: stdout chunk (state=3): >>><<< 11792 1727096118.58181: stderr chunk (state=3): >>><<< 11792 1727096118.58183: done transferring module to remote 11792 1727096118.58186: _low_level_execute_command(): starting 11792 1727096118.58188: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/ /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/AnsiballZ_setup.py && sleep 0' 11792 1727096118.60047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096118.60278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096118.60282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096118.62377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096118.62389: stdout chunk (state=3): >>><<< 11792 1727096118.62413: stderr chunk (state=3): >>><<< 11792 1727096118.62631: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096118.62634: _low_level_execute_command(): starting 11792 1727096118.62636: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/AnsiballZ_setup.py && sleep 0' 11792 1727096118.63842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096118.63851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096118.63854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096118.63856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096118.63959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096118.63964: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096118.64088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096118.64109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096118.64200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096118.64300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096118.66695: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 11792 1727096118.66734: stdout chunk (state=3): >>>import '_io' # <<< 11792 1727096118.66738: stdout chunk (state=3): >>>import 'marshal' # <<< 11792 1727096118.66944: stdout chunk (state=3): 
>>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 11792 1727096118.66977: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096118.66981: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11792 1727096118.66983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11792 1727096118.67206: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbf0184d0> <<< 11792 1727096118.67210: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbefe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbf01aa50> import '_signal' # import '_abc' # import 'abc' # <<< 11792 1727096118.67212: stdout chunk (state=3): >>>import 'io' # <<< 11792 1727096118.67286: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 11792 1727096118.67297: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11792 1727096118.67324: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 11792 1727096118.67452: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbedc9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096118.67508: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbedc9fa0> <<< 11792 1727096118.67544: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11792 1727096118.68297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096118.68306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 11792 1727096118.68309: stdout chunk (state=3): >>> <<< 11792 1727096118.68311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee07dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee07fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11792 1727096118.68451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee3f800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee3fe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee1faa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee1d1c0> <<< 11792 1727096118.68547: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee04f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11792 1727096118.68807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee5f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee5e300> 
# /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee1e060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee06e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee947a0> <<< 11792 1727096118.68814: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee04200> <<< 11792 1727096118.69062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11792 1727096118.69073: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.69081: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbee94c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee94b00> <<< 11792 1727096118.69311: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbee94ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee02d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee955b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee95280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee964b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbeeac680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' 
# extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbeeadd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11792 1727096118.69314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11792 1727096118.69377: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbeeaebd0> <<< 11792 1727096118.69389: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.69392: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbeeaf230> <<< 11792 1727096118.69394: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbeeae120> <<< 11792 1727096118.69397: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11792 1727096118.69399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11792 1727096118.69532: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.69536: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbeeafcb0> <<< 11792 1727096118.69538: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbeeaf3e0> <<< 11792 1727096118.69540: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee96450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11792 1727096118.69674: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11792 1727096118.69684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11792 1727096118.69835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebafb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd8650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd83b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd8680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11792 1727096118.69845: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.70186: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd8fb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd9910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd8860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebadd60> <<< 11792 1727096118.70190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11792 1727096118.70241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebdacc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd97f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee96ba0> <<< 11792 1727096118.70304: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11792 1727096118.70665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec07020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f3cbec2b410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11792 1727096118.70750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11792 1727096118.70788: stdout chunk (state=3): >>>import 'ntpath' # <<< 11792 1727096118.70802: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec881a0> <<< 11792 1727096118.70809: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11792 1727096118.70955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11792 1727096118.71071: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec8a900> <<< 11792 1727096118.71184: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec882c0> <<< 11792 1727096118.71241: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec551c0> <<< 11792 1727096118.71285: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe52d2e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec2a210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebdbbf0> <<< 11792 1727096118.71645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11792 1727096118.71758: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3cbec2a570> <<< 11792 1727096118.72095: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ds7gs2za/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 11792 1727096118.72303: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.72355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11792 1727096118.72359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11792 1727096118.72392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11792 1727096118.72526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11792 1727096118.72684: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe58f050> import '_typing' # <<< 11792 1727096118.72904: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe56df40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe56d0a0> <<< 11792 1727096118.72908: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 11792 1727096118.72961: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.72971: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 11792 1727096118.73040: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.74883: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.75947: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 11792 1727096118.75995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe58cf20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11792 1727096118.76312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11792 1727096118.76375: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe5be9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5be750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5be060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5be930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd8440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe5bf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe5bf830> # 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11792 1727096118.76431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11792 1727096118.76499: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5bfd70> import 'pwd' # <<< 11792 1727096118.76540: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11792 1727096118.76657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe429b20> <<< 11792 1727096118.76699: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe42b710> <<< 11792 1727096118.76730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42c0e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11792 1727096118.76805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11792 1727096118.76856: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42d250> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11792 1727096118.76875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11792 1727096118.76958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42fd10> <<< 11792 1727096118.77035: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbec57dd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42e000> <<< 11792 1727096118.77097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11792 1727096118.77158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11792 1727096118.77349: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11792 1727096118.77354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11792 1727096118.77415: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe437b90> import '_tokenize' # <<< 11792 1727096118.77510: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe436660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe4363c0> <<< 11792 1727096118.77563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11792 1727096118.77711: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe436930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42e4e0> <<< 11792 1727096118.77760: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe47bd70> <<< 11792 1727096118.77865: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11792 1727096118.77872: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.77924: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe47d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11792 1727096118.77954: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.78089: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe47ff20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47e090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11792 1727096118.78148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11792 1727096118.78171: stdout chunk (state=3): >>>import '_string' # <<< 11792 1727096118.78208: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe483740> <<< 11792 1727096118.78366: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe480110> <<< 11792 1727096118.78456: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe484500> <<< 11792 1727096118.78518: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe484950> <<< 11792 1727096118.78571: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe484a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47c110> <<< 11792 1727096118.78608: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 11792 1727096118.78650: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11792 1727096118.78746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.78783: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096118.78941: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe310170> <<< 11792 1727096118.79062: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11792 
1727096118.79097: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.79118: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe3115b0> <<< 11792 1727096118.79157: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe486900><<< 11792 1727096118.79161: stdout chunk (state=3): >>> <<< 11792 1727096118.79217: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096118.79222: stdout chunk (state=3): >>> <<< 11792 1727096118.79225: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096118.79230: stdout chunk (state=3): >>> <<< 11792 1727096118.79246: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe487cb0> <<< 11792 1727096118.79277: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe486540> <<< 11792 1727096118.79294: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096118.79329: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11792 1727096118.79332: stdout chunk (state=3): >>> <<< 11792 1727096118.79377: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11792 1727096118.79538: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096118.79628: stdout chunk (state=3): >>> <<< 11792 1727096118.79692: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096118.79715: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11792 1727096118.79731: stdout chunk (state=3): >>> <<< 11792 1727096118.79742: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 11792 1727096118.79777: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.79807: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.79834: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 11792 1727096118.79866: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11792 1727096118.79873: stdout chunk (state=3): >>> <<< 11792 1727096118.80060: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096118.80131: stdout chunk (state=3): >>> <<< 11792 1727096118.80276: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096118.80279: stdout chunk (state=3): >>> <<< 11792 1727096118.81281: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096118.81297: stdout chunk (state=3): >>> <<< 11792 1727096118.82309: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11792 1727096118.82312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096118.82593: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe3157c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3165d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3117f0> <<< 11792 1727096118.82623: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.82643: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11792 1727096118.82897: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.83101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe316360> # zipimport: zlib available <<< 11792 1727096118.83570: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.84252: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.84285: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11792 1727096118.84348: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.84393: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11792 1727096118.84410: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.84508: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.84664: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11792 1727096118.84677: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.84695: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11792 1727096118.84840: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.84843: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11792 1727096118.84945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.85268: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.85644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11792 1727096118.85750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11792 1727096118.85762: stdout chunk (state=3): >>>import '_ast' # <<< 11792 1727096118.85874: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe317830> <<< 11792 1727096118.85915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86002: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86135: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 
'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 11792 1727096118.86147: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 11792 1727096118.86180: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86264: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86286: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11792 1727096118.86334: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86382: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86428: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86537: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.86656: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11792 1727096118.86729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096118.86883: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe322030> <<< 11792 1727096118.86924: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe31f3b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 11792 1727096118.86992: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.87060: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.87318: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.87535: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11792 1727096118.87898: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe40aae0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe4fe7b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe322210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe321df0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11792 1727096118.88110: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.88146: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 11792 1727096118.88223: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.88377: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 11792 1727096118.88381: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.88627: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.88740: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.88904: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.88979: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11792 1727096118.89172: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b6510> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11792 1727096118.89176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11792 1727096118.89258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe4170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbdfe44a0> <<< 11792 1727096118.89317: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3a6c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b6ff0> <<< 11792 1727096118.89337: stdout chunk (state=3): >>>import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b4bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b4860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11792 1727096118.89386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11792 1727096118.89606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbdfe7440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe6cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbdfe6ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe6120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11792 1727096118.89825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe75c0> <<< 11792 1727096118.89947: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe0460c0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe0440e0> <<< 11792 1727096118.90046: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b48c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 11792 1727096118.90186: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 11792 1727096118.90189: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.90251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11792 1727096118.90255: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11792 1727096118.90322: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.90397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11792 1727096118.90455: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.90485: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 11792 1727096118.90620: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 11792 1727096118.90769: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.90779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11792 1727096118.90938: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 11792 1727096118.90941: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.90998: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.91119: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.91223: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11792 1727096118.92102: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.92831: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11792 1727096118.92839: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.92909: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.92914: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93021: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 11792 1727096118.93026: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.93061: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 11792 1727096118.93101: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93199: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 11792 1727096118.93208: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93270: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 11792 1727096118.93301: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11792 1727096118.93363: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11792 1727096118.93545: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe047980> <<< 11792 1727096118.93571: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11792 1727096118.93675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11792 1727096118.93723: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe046d80> import 'ansible.module_utils.facts.system.local' # <<< 11792 1727096118.93783: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93812: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.93881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11792 1727096118.93902: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.94033: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.94087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 11792 1727096118.94264: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.94270: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 11792 1727096118.94297: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.94335: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11792 1727096118.94398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11792 1727096118.95013: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe082390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe0721b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.95067: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11792 1727096118.95100: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.95220: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.95352: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.95536: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.95782: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 11792 1727096118.95786: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.95845: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.95907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 11792 1727096118.96029: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11792 1727096118.96040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11792 1727096118.96089: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096118.96170: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe095e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe095a90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 11792 1727096118.96184: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 11792 1727096118.96473: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.96806: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.96835: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 11792 1727096118.97006: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.97145: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.97259: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 11792 1727096118.97296: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.97308: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.97335: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.97609: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.97822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11792 1727096118.97826: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.98033: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.98271: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096118.98310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096118.99277: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.00146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11792 1727096119.00242: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.00373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.00485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11792 1727096119.00837: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 11792 1727096119.01057: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.01302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11792 1727096119.01336: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11792 1727096119.01370: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.01440: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 
1727096119.01479: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 11792 1727096119.01514: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.01634: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.01793: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.02129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.02463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11792 1727096119.02481: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.02528: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.02580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11792 1727096119.02623: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096119.02659: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 11792 1727096119.02767: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.02886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 11792 1727096119.02903: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.03019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11792 1727096119.03201: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096119.03285: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11792 1727096119.03306: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.03798: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.04160: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11792 1727096119.04244: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.04335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11792 1727096119.04339: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.04370: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.04429: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 11792 1727096119.04466: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.04513: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 11792 1727096119.04653: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 11792 1727096119.04694: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.04734: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.04877: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11792 1727096119.04970: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 11792 1727096119.04988: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 
1727096119.05042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 11792 1727096119.05073: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.05136: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.05164: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.05234: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.05346: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.05532: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11792 1727096119.05561: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.05584: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.05658: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11792 1727096119.05977: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.06306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11792 1727096119.06310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.06357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.06425: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 11792 1727096119.06492: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.06564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 11792 1727096119.06742: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.06804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 11792 1727096119.06817: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.06960: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.07086: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11792 1727096119.07202: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.08042: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11792 1727096119.08480: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbde2e180> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde2cf20> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde24260> <<< 11792 1727096119.25263: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11792 1727096119.25524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde2fef0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde74980> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096119.25529: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde760f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde75b80> <<< 11792 1727096119.25930: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11792 1727096119.51572: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.46240234375, "5m": 0.3916015625, "15m": 0.1787109375}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user<<< 11792 1727096119.51706: stdout chunk (state=3): >>>/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", 
"ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 261, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794197504, "block_size": 4096, "block_total": 65519099, "block_available": 63914599, "block_used": 1604500, "inode_total": 131070960, "inode_available": 131029117, "inode_used": 41843, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "19", "epoch": "1727096119", "epoch_int": "1727096119", "date": "2024-09-23", "time": "08:55:19", "iso8601_micro": "2024-09-23T12:55:19.459608Z", "iso8601": "2024-09-23T12:55:19Z", "iso8601_basic": "20240923T085519459608", "iso8601_basic_short": "20240923T085519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11792 1727096119.52674: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 11792 1727096119.52687: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 11792 1727096119.52730: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # 
destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 11792 1727096119.52737: stdout chunk (state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils <<< 11792 1727096119.52741: stdout chunk (state=3): >>># destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string <<< 11792 1727096119.52770: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux <<< 11792 1727096119.52916: stdout chunk (state=3): >>># cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing 
multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction <<< 11792 1727096119.52923: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux <<< 11792 1727096119.52926: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos 
# cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi <<< 11792 1727096119.52928: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other <<< 11792 1727096119.52930: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network <<< 11792 1727096119.52932: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11792 1727096119.53315: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11792 1727096119.53852: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11792 1727096119.53872: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # 
destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 11792 1727096119.53899: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 11792 1727096119.53937: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux<<< 11792 1727096119.53944: stdout chunk (state=3): >>> <<< 11792 1727096119.53963: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes<<< 11792 1727096119.54000: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 11792 1727096119.54005: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser <<< 11792 1727096119.54023: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon<<< 11792 1727096119.54042: stdout chunk (state=3): >>> # cleanup[3] wiping _socket <<< 11792 1727096119.54047: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback<<< 11792 1727096119.54076: stdout chunk (state=3): >>> # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize<<< 11792 1727096119.54084: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 11792 1727096119.54106: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing<<< 11792 1727096119.54120: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading<<< 11792 1727096119.54139: stdout chunk (state=3): >>> # cleanup[3] wiping weakref<<< 11792 1727096119.54163: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 11792 1727096119.54181: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 11792 1727096119.54200: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 11792 1727096119.54206: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg<<< 11792 1727096119.54341: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] 
wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11792 1727096119.54612: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11792 1727096119.54640: stdout chunk (state=3): >>># destroy _socket <<< 11792 1727096119.54682: stdout chunk (state=3): >>># destroy _collections <<< 11792 1727096119.54734: stdout chunk (state=3): >>># destroy platform <<< 11792 1727096119.54787: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 11792 1727096119.54835: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib<<< 11792 1727096119.54846: stdout chunk (state=3): >>> # destroy copyreg<<< 11792 1727096119.54911: stdout chunk (state=3): >>> # destroy contextlib # destroy _typing <<< 11792 1727096119.54938: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request<<< 11792 1727096119.54985: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves<<< 11792 1727096119.55043: stdout chunk (state=3): >>> # destroy _frozen_importlib_external # destroy _imp<<< 11792 1727096119.55048: stdout chunk (state=3): >>> # destroy _io # destroy marshal # clear sys.meta_path <<< 11792 1727096119.55063: stdout chunk (state=3): >>># clear sys.modules # destroy _frozen_importlib <<< 11792 1727096119.55232: stdout chunk (state=3): >>># destroy codecs<<< 11792 1727096119.55258: stdout chunk (state=3): >>> # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 11792 1727096119.55280: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11792 1727096119.55347: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib<<< 11792 1727096119.55357: stdout chunk (state=3): >>> # destroy _operator<<< 11792 1727096119.55417: stdout chunk (state=3): >>> # destroy _sre<<< 11792 1727096119.55454: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools # destroy _abc<<< 11792 1727096119.55458: stdout chunk (state=3): >>> # destroy posix # destroy _functools<<< 11792 1727096119.55488: stdout chunk (state=3): >>> # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11792 1727096119.56019: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096119.56060: stderr chunk (state=3): >>><<< 11792 1727096119.56062: stdout chunk (state=3): >>><<< 11792 1727096119.56195: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbf0184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbefe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbf01aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbedc9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbedc9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee07dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee07fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee3f800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee3fe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee1faa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee1d1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee04f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee5f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee5e300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee1e060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee06e70> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee947a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee04200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbee94c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee94b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbee94ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee02d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee955b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee95280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee964b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbeeac680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbeeadd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3cbeeaebd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbeeaf230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbeeae120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbeeafcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbeeaf3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee96450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebafb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd8650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd83b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd8680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd8fb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbebd9910> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd8860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebadd60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebdacc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd97f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbee96ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec07020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec2b410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec881a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec8a900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec882c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec551c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe52d2e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbec2a210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebdbbf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f3cbec2a570> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ds7gs2za/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe58f050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe56df40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe56d0a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe58cf20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe5be9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5be750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5be060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5be930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbebd8440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe5bf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe5bf830> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe5bfd70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe429b20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe42b710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42c0e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42d250> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbec57dd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42e000> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe437b90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe436660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3cbe4363c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe436930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe42e4e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe47bd70> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe47d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe47ff20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47e090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe483740> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe480110> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe484500> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe484950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe484a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe47c110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe310170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe3115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe486900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe487cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe486540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe3157c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3165d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3117f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe316360> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe317830> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe322030> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe31f3b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe40aae0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe4fe7b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe322210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe321df0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b6510> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe4170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbdfe44a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3a6c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b6ff0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b4bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b4860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbdfe7440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe6cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbdfe6ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe6120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbdfe75c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe0460c0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe0440e0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe3b48c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe047980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe046d80> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe082390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe0721b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbe095e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbe095a90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3cbde2e180> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde2cf20> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde24260> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde2fef0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde74980> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde760f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3cbde75b80> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.46240234375, "5m": 0.3916015625, "15m": 0.1787109375}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": 
{"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 261, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794197504, "block_size": 4096, "block_total": 65519099, "block_available": 63914599, "block_used": 1604500, "inode_total": 131070960, "inode_available": 131029117, "inode_used": 41843, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "19", "epoch": "1727096119", "epoch_int": "1727096119", "date": "2024-09-23", "time": "08:55:19", "iso8601_micro": "2024-09-23T12:55:19.459608Z", "iso8601": "2024-09-23T12:55:19Z", "iso8601_basic": "20240923T085519459608", "iso8601_basic_short": "20240923T085519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off 
[fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", 
"macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # 
destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] 
removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy 
ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing 
multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # 
cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy 
urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy 
_lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types 
# cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
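The warning above comes from Ansible's interpreter discovery: this run relied on the auto-detected /usr/bin/python3.12 on managed_node2, and the warning notes that the meaning of that path could change if another Python is installed later. One way to make the run reproducible and avoid the warning is to pin the interpreter per host in the inventory. The snippet below is only an illustrative sketch, not the inventory actually used by this run; the host name, address, and interpreter path are taken from the log output above, and the surrounding structure is assumed:

all:
  hosts:
    managed_node2:
      ansible_host: 10.31.15.126                        # address seen in the SSH debug output above
      ansible_python_interpreter: /usr/bin/python3.12   # pin the interpreter explicitly instead of relying on discovery

With ansible_python_interpreter set to an explicit path, interpreter discovery is skipped for that host and this warning should no longer appear.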
11792 1727096119.57057: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096119.57061: _low_level_execute_command(): starting 11792 1727096119.57063: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096117.807103-11825-119760924985589/ > /dev/null 2>&1 && sleep 0' 11792 1727096119.57184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096119.57227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096119.57230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.57232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096119.57234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096119.57236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.57238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096119.57241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096119.57244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.57303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096119.57306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096119.57372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096119.60109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096119.60136: stderr chunk (state=3): >>><<< 11792 1727096119.60140: stdout chunk (state=3): >>><<< 11792 1727096119.60155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096119.60162: handler run complete 11792 1727096119.60244: variable 'ansible_facts' from source: unknown 11792 1727096119.60311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096119.60514: variable 'ansible_facts' from source: unknown 11792 1727096119.60570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096119.60646: attempt loop complete, returning result 11792 1727096119.60650: _execute() done 11792 1727096119.60654: dumping result to json 11792 1727096119.60675: done dumping result, returning 11792 1727096119.60683: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0afff68d-5257-d9c7-3fc0-000000000015] 11792 1727096119.60686: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000015 11792 1727096119.60943: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000015 11792 1727096119.60945: WORKER PROCESS EXITING ok: [managed_node2] 11792 1727096119.61226: no more pending results, returning what we have 11792 1727096119.61229: results queue empty 11792 1727096119.61230: checking for any_errors_fatal 11792 1727096119.61231: done checking for any_errors_fatal 11792 1727096119.61232: checking for max_fail_percentage 11792 1727096119.61234: done checking for max_fail_percentage 11792 1727096119.61234: checking to see if all hosts have failed and the running result is not ok 11792 1727096119.61235: done checking to see if all hosts have failed 11792 1727096119.61236: getting the remaining hosts for this loop 11792 1727096119.61237: done getting the remaining hosts for this loop 11792 1727096119.61241: getting the next task for host managed_node2 11792 1727096119.61246: done getting next task for host managed_node2 11792 1727096119.61248: ^ task is: TASK: meta (flush_handlers) 11792 1727096119.61250: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096119.61254: getting variables 11792 1727096119.61255: in VariableManager get_vars() 11792 1727096119.61308: Calling all_inventory to load vars for managed_node2 11792 1727096119.61312: Calling groups_inventory to load vars for managed_node2 11792 1727096119.61315: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096119.61325: Calling all_plugins_play to load vars for managed_node2 11792 1727096119.61328: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096119.61331: Calling groups_plugins_play to load vars for managed_node2 11792 1727096119.61526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096119.61739: done with get_vars() 11792 1727096119.61751: done getting variables 11792 1727096119.61809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 11792 1727096119.61879: in VariableManager get_vars() 11792 1727096119.61889: Calling all_inventory to load vars for managed_node2 11792 1727096119.61891: Calling groups_inventory to load vars for managed_node2 11792 1727096119.61893: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096119.61900: Calling all_plugins_play to load vars for managed_node2 11792 1727096119.61903: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096119.61906: Calling groups_plugins_play to load vars for managed_node2 11792 1727096119.62118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096119.62295: done with get_vars() 11792 1727096119.62310: done queuing things up, now waiting for results queue to drain 11792 1727096119.62312: results queue empty 11792 1727096119.62312: checking for any_errors_fatal 11792 1727096119.62314: done checking for any_errors_fatal 11792 1727096119.62315: checking for max_fail_percentage 11792 1727096119.62315: done checking for max_fail_percentage 11792 1727096119.62316: checking to see if all hosts have failed and the running result is not ok 11792 1727096119.62316: done checking to see if all hosts have failed 11792 1727096119.62317: getting the remaining hosts for this loop 11792 1727096119.62317: done getting the remaining hosts for this loop 11792 1727096119.62319: getting the next task for host managed_node2 11792 1727096119.62323: done getting next task for host managed_node2 11792 1727096119.62324: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11792 1727096119.62325: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096119.62327: getting variables 11792 1727096119.62327: in VariableManager get_vars() 11792 1727096119.62333: Calling all_inventory to load vars for managed_node2 11792 1727096119.62334: Calling groups_inventory to load vars for managed_node2 11792 1727096119.62336: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096119.62339: Calling all_plugins_play to load vars for managed_node2 11792 1727096119.62341: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096119.62342: Calling groups_plugins_play to load vars for managed_node2 11792 1727096119.62428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096119.62557: done with get_vars() 11792 1727096119.62563: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:11 Monday 23 September 2024 08:55:19 -0400 (0:00:01.867) 0:00:01.905 ****** 11792 1727096119.62629: entering _queue_task() for managed_node2/include_tasks 11792 1727096119.62631: Creating lock for include_tasks 11792 1727096119.62882: worker is 1 (out of 1 available) 11792 1727096119.62897: exiting _queue_task() for managed_node2/include_tasks 11792 1727096119.62908: done queuing things up, now waiting for results queue to drain 11792 1727096119.62910: waiting for pending results... 11792 1727096119.63053: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 11792 1727096119.63102: in run() - task 0afff68d-5257-d9c7-3fc0-000000000006 11792 1727096119.63114: variable 'ansible_search_path' from source: unknown 11792 1727096119.63174: calling self._execute() 11792 1727096119.63204: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096119.63210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096119.63218: variable 'omit' from source: magic vars 11792 1727096119.63295: _execute() done 11792 1727096119.63299: dumping result to json 11792 1727096119.63302: done dumping result, returning 11792 1727096119.63305: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-d9c7-3fc0-000000000006] 11792 1727096119.63311: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000006 11792 1727096119.63452: no more pending results, returning what we have 11792 1727096119.63458: in VariableManager get_vars() 11792 1727096119.63492: Calling all_inventory to load vars for managed_node2 11792 1727096119.63495: Calling groups_inventory to load vars for managed_node2 11792 1727096119.63498: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096119.63510: Calling all_plugins_play to load vars for managed_node2 11792 1727096119.63512: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096119.63515: Calling groups_plugins_play to load vars for managed_node2 11792 1727096119.63650: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000006 11792 1727096119.63654: WORKER PROCESS EXITING 11792 1727096119.63663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096119.63778: done with get_vars() 11792 1727096119.63785: variable 'ansible_search_path' from source: unknown 11792 1727096119.63799: we have included files to process 11792 1727096119.63800: 
generating all_blocks data 11792 1727096119.63801: done generating all_blocks data 11792 1727096119.63802: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11792 1727096119.63802: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11792 1727096119.63804: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11792 1727096119.64263: in VariableManager get_vars() 11792 1727096119.64277: done with get_vars() 11792 1727096119.64284: done processing included file 11792 1727096119.64285: iterating over new_blocks loaded from include file 11792 1727096119.64286: in VariableManager get_vars() 11792 1727096119.64291: done with get_vars() 11792 1727096119.64292: filtering new block on tags 11792 1727096119.64301: done filtering new block on tags 11792 1727096119.64303: in VariableManager get_vars() 11792 1727096119.64309: done with get_vars() 11792 1727096119.64310: filtering new block on tags 11792 1727096119.64318: done filtering new block on tags 11792 1727096119.64320: in VariableManager get_vars() 11792 1727096119.64326: done with get_vars() 11792 1727096119.64327: filtering new block on tags 11792 1727096119.64334: done filtering new block on tags 11792 1727096119.64335: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 11792 1727096119.64341: extending task lists for all hosts with included blocks 11792 1727096119.64374: done extending task lists 11792 1727096119.64375: done processing included files 11792 1727096119.64375: results queue empty 11792 1727096119.64376: checking for any_errors_fatal 11792 1727096119.64377: done checking for any_errors_fatal 11792 1727096119.64377: checking for max_fail_percentage 11792 1727096119.64378: done checking for max_fail_percentage 11792 1727096119.64378: checking to see if all hosts have failed and the running result is not ok 11792 1727096119.64379: done checking to see if all hosts have failed 11792 1727096119.64379: getting the remaining hosts for this loop 11792 1727096119.64380: done getting the remaining hosts for this loop 11792 1727096119.64381: getting the next task for host managed_node2 11792 1727096119.64384: done getting next task for host managed_node2 11792 1727096119.64385: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11792 1727096119.64387: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096119.64388: getting variables 11792 1727096119.64388: in VariableManager get_vars() 11792 1727096119.64395: Calling all_inventory to load vars for managed_node2 11792 1727096119.64397: Calling groups_inventory to load vars for managed_node2 11792 1727096119.64398: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096119.64402: Calling all_plugins_play to load vars for managed_node2 11792 1727096119.64403: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096119.64405: Calling groups_plugins_play to load vars for managed_node2 11792 1727096119.64500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096119.64612: done with get_vars() 11792 1727096119.64618: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:55:19 -0400 (0:00:00.020) 0:00:01.925 ****** 11792 1727096119.64666: entering _queue_task() for managed_node2/setup 11792 1727096119.64902: worker is 1 (out of 1 available) 11792 1727096119.64915: exiting _queue_task() for managed_node2/setup 11792 1727096119.64927: done queuing things up, now waiting for results queue to drain 11792 1727096119.64929: waiting for pending results... 11792 1727096119.65078: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 11792 1727096119.65135: in run() - task 0afff68d-5257-d9c7-3fc0-000000000026 11792 1727096119.65146: variable 'ansible_search_path' from source: unknown 11792 1727096119.65151: variable 'ansible_search_path' from source: unknown 11792 1727096119.65184: calling self._execute() 11792 1727096119.65237: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096119.65243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096119.65250: variable 'omit' from source: magic vars 11792 1727096119.65629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096119.67100: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096119.67148: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096119.67179: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096119.67214: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096119.67237: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096119.67300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096119.67320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096119.67342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11792 1727096119.67373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096119.67384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096119.67509: variable 'ansible_facts' from source: unknown 11792 1727096119.67559: variable 'network_test_required_facts' from source: task vars 11792 1727096119.67589: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11792 1727096119.67592: variable 'omit' from source: magic vars 11792 1727096119.67620: variable 'omit' from source: magic vars 11792 1727096119.67644: variable 'omit' from source: magic vars 11792 1727096119.67669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096119.67691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096119.67705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096119.67718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096119.67727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096119.67748: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096119.67755: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096119.67758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096119.67826: Set connection var ansible_timeout to 10 11792 1727096119.67832: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096119.67840: Set connection var ansible_shell_executable to /bin/sh 11792 1727096119.67845: Set connection var ansible_pipelining to False 11792 1727096119.67848: Set connection var ansible_shell_type to sh 11792 1727096119.67850: Set connection var ansible_connection to ssh 11792 1727096119.67877: variable 'ansible_shell_executable' from source: unknown 11792 1727096119.67880: variable 'ansible_connection' from source: unknown 11792 1727096119.67883: variable 'ansible_module_compression' from source: unknown 11792 1727096119.67885: variable 'ansible_shell_type' from source: unknown 11792 1727096119.67887: variable 'ansible_shell_executable' from source: unknown 11792 1727096119.67889: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096119.67891: variable 'ansible_pipelining' from source: unknown 11792 1727096119.67893: variable 'ansible_timeout' from source: unknown 11792 1727096119.67897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096119.67997: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096119.68005: variable 'omit' from source: magic vars 11792 1727096119.68010: starting attempt loop 11792 
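The "Evaluated conditional" entry above is the guard on this fact-gathering task: setup runs only when the fact names listed in network_test_required_facts are not all present in ansible_facts yet. A rough, hypothetical Python equivalent of that Jinja expression (variable names taken from the log; the helper itself is illustrative, not Ansible code):

    # Jinja: not ansible_facts.keys() | list | intersect(network_test_required_facts)
    #            == network_test_required_facts
    # i.e. "some required fact is still missing", so gathering is needed.
    def needs_fact_gathering(ansible_facts, required):
        present = [name for name in required if name in ansible_facts]  # like the intersect filter
        return not (present == required)

    # With an empty fact cache every required fact is missing, so the task runs:
    print(needs_fact_gathering({}, ["distribution", "os_family"]))  # True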
1727096119.68012: running the handler 11792 1727096119.68026: _low_level_execute_command(): starting 11792 1727096119.68032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096119.68527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.68560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.68564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.68566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.68618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096119.68621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096119.68623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096119.68692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096119.71299: stdout chunk (state=3): >>>/root <<< 11792 1727096119.71337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096119.71400: stderr chunk (state=3): >>><<< 11792 1727096119.71410: stdout chunk (state=3): >>><<< 11792 1727096119.71447: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096119.71483: _low_level_execute_command(): starting 11792 1727096119.71501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558 `" && echo ansible-tmp-1727096119.7146897-11908-95445472081558="` echo /root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558 `" ) && sleep 0' 11792 1727096119.72194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096119.72211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096119.72230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.72273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.72291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096119.72382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.72404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096119.72424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096119.72459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096119.72558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096119.75572: stdout chunk (state=3): >>>ansible-tmp-1727096119.7146897-11908-95445472081558=/root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558 <<< 11792 1727096119.75792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096119.75796: stdout chunk (state=3): >>><<< 11792 1727096119.75798: stderr chunk (state=3): >>><<< 11792 1727096119.75815: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096119.7146897-11908-95445472081558=/root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096119.75978: variable 'ansible_module_compression' from source: unknown 11792 1727096119.75982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11792 1727096119.76005: variable 'ansible_facts' from source: unknown 11792 1727096119.76229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/AnsiballZ_setup.py 11792 1727096119.76434: Sending initial data 11792 1727096119.76438: Sent initial data (153 bytes) 11792 1727096119.76929: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.76944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.76969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.77001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096119.77013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096119.77066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096119.79401: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096119.79685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
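The two /bin/sh commands executed a few entries above are the usual remote-setup handshake: 'echo ~ && sleep 0' probes the remote account's home directory (stdout=/root), and the '( umask 77 && mkdir -p ... && mkdir ... && echo name="path" )' compound creates a private, uniquely named temporary directory under ~/.ansible/tmp and echoes name=path back so the controller can parse the created path out of stdout. A local, self-contained sketch of that pattern (base directory and naming are illustrative):

    import os, subprocess, tempfile, time

    base = os.path.join(tempfile.gettempdir(), "demo-ansible-tmp")
    name = f"ansible-tmp-{time.time()}-{os.getpid()}"
    cmd = (
        f'( umask 77 && mkdir -p "{base}" && mkdir "{base}/{name}" '
        f'&& echo {name}="{base}/{name}" ) && sleep 0'
    )
    out = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True, check=True)
    # The controller learns the new directory by splitting the echoed line on "=".
    tmpdir = out.stdout.strip().split("=", 1)[1]
    print(tmpdir)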
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/AnsiballZ_setup.py" <<< 11792 1727096119.79689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpuatdd2h3 /root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/AnsiballZ_setup.py <<< 11792 1727096119.79712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpuatdd2h3" to remote "/root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/AnsiballZ_setup.py" <<< 11792 1727096119.81652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096119.81754: stderr chunk (state=3): >>><<< 11792 1727096119.81773: stdout chunk (state=3): >>><<< 11792 1727096119.81895: done transferring module to remote 11792 1727096119.81898: _low_level_execute_command(): starting 11792 1727096119.81901: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/ /root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/AnsiballZ_setup.py && sleep 0' 11792 1727096119.82540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096119.82557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096119.82574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.82592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096119.82620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096119.82739: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096119.82974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096119.82990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096119.83203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096119.86108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096119.86112: stdout chunk (state=3): >>><<< 11792 1727096119.86114: stderr chunk (state=3): >>><<< 11792 1727096119.86116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096119.86119: _low_level_execute_command(): starting 11792 1727096119.86121: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/AnsiballZ_setup.py && sleep 0' 11792 1727096119.86995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096119.87017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096119.87058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.87084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096119.87112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096119.87125: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096119.87159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.87217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096119.87223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096119.87309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096119.87359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096119.87420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096119.90336: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 11792 1727096119.90342: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11792 1727096119.90673: stdout chunk (state=3): >>>import 'posix' # <<< 11792 1727096119.90680: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11792 1727096119.90684: stdout chunk (state=3): >>>import 'time' # <<< 11792 1727096119.90730: stdout chunk (state=3): >>>import 'zipimport' # # 
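At this point the uploaded wrapper has been made executable (chmod u+x on the temp directory and on AnsiballZ_setup.py) and is being run with PYTHONVERBOSE=1, which is why the long trace that follows consists of the interpreter reporting every module it imports ('import ... #', '# code object from ...'). A small local sketch of those two steps, using a throwaway script in place of AnsiballZ_setup.py:

    import os, stat, subprocess, tempfile

    script = os.path.join(tempfile.mkdtemp(), "demo_module.py")
    with open(script, "w") as f:
        f.write("import json\nprint(json.dumps({'ping': 'pong'}))\n")

    # chmod u+x, as in the '/bin/sh -c chmod u+x ...' command above.
    os.chmod(script, os.stat(script).st_mode | stat.S_IXUSR)

    # PYTHONVERBOSE=1 has the same effect as 'python -v': every import is reported.
    env = dict(os.environ, PYTHONVERBOSE="1")
    result = subprocess.run(["python3", script], env=env, capture_output=True, text=True)
    combined = result.stdout + result.stderr
    print(sum(1 for line in combined.splitlines() if line.startswith("import ")), "import lines")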
installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00721b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072183b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00721b6a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 11792 1727096119.90734: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11792 1727096119.90810: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11792 1727096119.90839: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11792 1727096119.90889: stdout chunk (state=3): >>>import 'os' # <<< 11792 1727096119.90919: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 11792 1727096119.90961: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11792 1727096119.90982: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071f65130> <<< 11792 1727096119.91039: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11792 1727096119.91064: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071f65fa0> <<< 11792 1727096119.91083: stdout chunk (state=3): >>>import 'site' # <<< 11792 1727096119.91107: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
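The 'Processing user site-packages' / 'Processing .pth file' lines above come from the interpreter's site module, which extends sys.path at startup and only prints those messages when verbose mode is on. The directories it registers can be inspected directly (a small illustrative check):

    import site, sys

    # site processing runs automatically at startup (unless -S is passed) and is
    # what adds the site-packages directories reported in the trace above.
    print(site.getsitepackages())       # global site-packages directories
    print(site.getusersitepackages())   # per-user site-packages directory
    print(sys.path[-3:])                # tail of sys.path after site processing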
<<< 11792 1727096119.91486: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11792 1727096119.91520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11792 1727096119.91545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11792 1727096119.91593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11792 1727096119.91610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11792 1727096119.91637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11792 1727096119.91671: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa3e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11792 1727096119.91752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa3f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11792 1727096119.91782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11792 1727096119.91921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fdb890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fdbf20> import '_collections' # <<< 11792 1727096119.91979: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fbbb60> <<< 11792 1727096119.92006: stdout chunk (state=3): >>>import '_functools' # <<< 11792 1727096119.92016: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fb9280> <<< 11792 1727096119.92135: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa1040> <<< 11792 1727096119.92142: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11792 1727096119.92243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11792 1727096119.92295: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071ffb800> <<< 11792 1727096119.92300: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071ffa420> <<< 11792 1727096119.92325: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071ff8b60> <<< 11792 1727096119.92517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072030860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0072030d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072030bc0> <<< 11792 1727096119.92520: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0072030f80> <<< 11792 1727096119.92564: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071f9ede0> <<< 11792 1727096119.92569: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096119.92720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072031610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00720312e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11792 1727096119.92733: stdout chunk 
(state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072032510> import 'importlib.util' # <<< 11792 1727096119.92735: stdout chunk (state=3): >>>import 'runpy' # <<< 11792 1727096119.92757: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11792 1727096119.92790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11792 1727096119.92901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072048710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0072049df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11792 1727096119.92953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11792 1727096119.92956: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007204ac90> <<< 11792 1727096119.93014: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007204b2f0> <<< 11792 1727096119.93017: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007204a1e0> <<< 11792 1727096119.93099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11792 1727096119.93118: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007204bd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007204b4a0> <<< 11792 1727096119.93155: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072032540> <<< 11792 1727096119.93176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11792 1727096119.93196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11792 1727096119.93261: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11792 1727096119.93392: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d3bc50> <<< 11792 1727096119.93395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11792 1727096119.93419: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d64710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d64470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d64590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11792 1727096119.93489: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096119.93617: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d65010> <<< 11792 1727096119.93785: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d659d0> <<< 11792 1727096119.93792: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d648c0> <<< 11792 1727096119.93823: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d39df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11792 1727096119.93870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11792 1727096119.93956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d66de0> <<< 11792 1727096119.93964: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d65b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072032c30> <<< 
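Most of this trace is pairs of the form '# .../__pycache__/foo.cpython-312.pyc matches .../foo.py' followed by '# code object from ...': the importer found an up-to-date cached bytecode file and loaded it instead of recompiling the source. The cache layout those messages refer to can be reproduced locally:

    import importlib.util, pathlib, py_compile, tempfile

    src = pathlib.Path(tempfile.mkdtemp()) / "example.py"
    src.write_text("ANSWER = 42\n")

    cached = py_compile.compile(str(src))              # writes __pycache__/example.cpython-XXX.pyc
    print(cached)
    print(importlib.util.cache_from_source(str(src)))  # the path the importer checks for a "match"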
11792 1727096119.93969: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11792 1727096119.94042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11792 1727096119.94315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d93140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071db34d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11792 1727096119.94338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11792 1727096119.94396: stdout chunk (state=3): >>>import 'ntpath' # <<< 11792 1727096119.94424: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071e14200> <<< 11792 1727096119.94438: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11792 1727096119.94463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11792 1727096119.94488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11792 1727096119.94533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11792 1727096119.94624: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071e16960> <<< 11792 1727096119.94698: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071e14320> <<< 11792 1727096119.94751: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071dd9250> <<< 11792 1727096119.94825: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717252e0> <<< 11792 1727096119.94833: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071db22d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d67d40> <<< 11792 1727096119.95143: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11792 1727096119.95147: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0071db2630> <<< 11792 1727096119.95386: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_s2f6njbt/ansible_setup_payload.zip' # zipimport: zlib available <<< 11792 1727096119.95520: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.95627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11792 1727096119.95630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11792 1727096119.95673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11792 1727096119.95691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11792 1727096119.95725: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007178ef90> import '_typing' # <<< 11792 1727096119.95902: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007176de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007176cfe0> # zipimport: zlib available <<< 11792 1727096119.95960: stdout chunk (state=3): >>>import 'ansible' # <<< 11792 1727096119.96060: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096119.96063: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 11792 1727096119.97491: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096119.98683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007178ce30> <<< 11792 1727096119.98713: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11792 1727096119.98799: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11792 1727096119.98808: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00717be960> <<< 11792 1727096119.98978: stdout chunk (state=3): 
>>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717be6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717be000> <<< 11792 1727096119.98982: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11792 1727096119.98984: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717be4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007178f9b0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00717bf6e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00717bf920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11792 1727096119.99031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11792 1727096119.99055: stdout chunk (state=3): >>>import '_locale' # <<< 11792 1727096119.99087: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717bfe30> <<< 11792 1727096119.99111: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11792 1727096119.99143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11792 1727096119.99179: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071629ca0> <<< 11792 1727096119.99302: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007162b8c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162c2c0> <<< 11792 1727096119.99305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11792 1727096119.99336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11792 1727096119.99374: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162d460> <<< 11792 1727096119.99377: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11792 1727096119.99406: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11792 1727096119.99482: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11792 1727096119.99513: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162fef0> <<< 11792 1727096119.99546: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071f9eed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162e1b0> <<< 11792 1727096119.99619: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11792 1727096119.99633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11792 1727096119.99811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11792 1727096119.99834: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071637f20> import '_tokenize' # <<< 11792 1727096119.99881: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00716369f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071636750> <<< 11792 1727096119.99929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11792 1727096120.00035: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071636cc0> <<< 11792 1727096120.00072: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007167c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167c350> <<< 11792 1727096120.00142: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11792 1727096120.00148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 11792 1727096120.00251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007167ddc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167db80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11792 1727096120.00288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00716802c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167e450> <<< 11792 1727096120.00300: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11792 1727096120.00369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.00488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11792 1727096120.00491: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071683aa0> <<< 11792 1727096120.00547: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071680470> <<< 11792 1727096120.00614: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071684830> <<< 11792 1727096120.00644: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071684a70> <<< 11792 1727096120.00696: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00716843e0> <<< 11792 1727096120.00799: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167c500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.00826: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00715104d0> <<< 11792 1727096120.00987: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071511a90> <<< 11792 1727096120.01031: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071686c30> <<< 11792 1727096120.01140: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071687fb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071686840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11792 1727096120.01144: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.01271: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.01303: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 11792 1727096120.01321: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.01431: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.01553: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.02120: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.02694: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11792 1727096120.02919: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module 
'_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071515b80> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715169f0> <<< 11792 1727096120.02922: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071511a30> <<< 11792 1727096120.02971: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11792 1727096120.03000: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.03030: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11792 1727096120.03182: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.03353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11792 1727096120.03376: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071516570> # zipimport: zlib available <<< 11792 1727096120.03845: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.04297: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.04496: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.04532: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11792 1727096120.04546: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.04613: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.04692: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11792 1727096120.04721: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.04744: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11792 1727096120.04785: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.04831: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11792 1727096120.04842: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05075: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11792 1727096120.05410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 11792 1727096120.05487: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071517a10> # zipimport: zlib available <<< 11792 1727096120.05564: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05632: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 
'ansible.module_utils.common.validation' # <<< 11792 1727096120.05655: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11792 1727096120.05719: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05722: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05806: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 11792 1727096120.05820: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05890: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05923: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.05993: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11792 1727096120.06040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.06142: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00715222d0> <<< 11792 1727096120.06258: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007151d1c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11792 1727096120.06295: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.06363: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.06385: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.06433: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.06460: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11792 1727096120.06483: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11792 1727096120.06696: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11792 1727096120.06714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007160aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717ea750> <<< 11792 1727096120.06808: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00716854f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f00716366c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11792 1727096120.06832: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.06854: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.06882: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11792 1727096120.06987: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11792 1727096120.06990: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11792 1727096120.06993: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07047: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07114: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07154: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07157: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07195: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07235: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07407: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.07475: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07538: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 11792 1727096120.07548: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07726: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07898: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.07999: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.08208: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11792 1727096120.08212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b2840> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11792 1727096120.08238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11792 1727096120.08266: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 11792 1727096120.08287: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071168170> <<< 11792 1727096120.08318: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00711684d0> <<< 11792 1727096120.08384: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715a7410> <<< 11792 1727096120.08424: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b3380> <<< 11792 1727096120.08431: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b0f20> <<< 11792 1727096120.08456: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b0a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11792 1727096120.08514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11792 1727096120.08548: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11792 1727096120.08651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11792 1727096120.08676: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007116b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007116acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007116aed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007116a120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11792 1727096120.08819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11792 1727096120.08890: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007116b620> <<< 11792 1727096120.08899: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11792 1727096120.08952: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00711ca150> <<< 11792 1727096120.08990: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711c8170> <<< 11792 1727096120.09102: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b0c20> import 'ansible.module_utils.facts.timeout' # <<< 11792 1727096120.09106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11792 1727096120.09116: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.09166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11792 1727096120.09188: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.09231: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.09289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11792 1727096120.09513: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11792 1727096120.09516: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.09519: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11792 1727096120.09552: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.09592: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11792 1727096120.09616: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.09661: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.09744: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.09852: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11792 1727096120.10347: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.11074: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 11792 1727096120.11109: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.11146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 11792 1727096120.11166: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.11236: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 11792 1727096120.11331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 11792 1727096120.11373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.11413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 11792 1727096120.11461: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.11504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11792 1727096120.11586: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.11644: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.11782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 11792 1727096120.11785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11792 1727096120.11814: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711ca300> <<< 11792 1727096120.11853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11792 1727096120.11888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11792 1727096120.12085: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711caed0> <<< 11792 1727096120.12287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.12428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11792 1727096120.12446: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.12464: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.12593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11792 1727096120.12609: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.12715: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.12823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 11792 1727096120.12940: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.13175: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11792 1727096120.13212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.13228: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00712064b0> <<< 11792 1727096120.13534: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711f62d0> <<< 11792 1727096120.13559: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 11792 1727096120.13654: stdout chunk (state=3): >>># zipimport: zlib available <<< 
11792 1727096120.13706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 11792 1727096120.13888: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.15394: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007121a2d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071219ee0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 11792 1727096120.15594: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.15769: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.15881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11792 1727096120.15895: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.15991: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.16557: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.17092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11792 1727096120.17106: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.17211: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.17319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 11792 1727096120.17426: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.17535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11792 1727096120.17547: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.17994: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.18091: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 11792 1727096120.18134: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.18256: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.18602: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.18898: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.19144: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.19205: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 11792 1727096120.19278: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available<<< 11792 1727096120.19352: stdout chunk (state=3): >>> <<< 11792 1727096120.19419: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.19534: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11792 1727096120.19548: stdout chunk (state=3): >>> <<< 11792 1727096120.19672: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11792 1727096120.19721: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.19965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.20075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 11792 1727096120.20658: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.20807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 11792 1727096120.20826: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.20989: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.21022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 11792 1727096120.21039: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.21072: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.21137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11792 1727096120.21151: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.21189: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 11792 1727096120.21274: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.21371: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11792 1727096120.21435: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.21438: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 11792 1727096120.21448: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.21574: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.21578: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.21605: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.22173: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.22202: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.22324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11792 1727096120.22341: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.22378: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.22425: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11792 1727096120.22463: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.22962: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.23081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11792 1727096120.23109: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11792 1727096120.23128: stdout chunk (state=3): >>> <<< 11792 1727096120.23245: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.24255: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 11792 1727096120.24336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11792 1727096120.24361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc'<<< 11792 1727096120.24419: stdout chunk (state=3): >>> # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.24443: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.24499: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071016720> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071014f50> <<< 11792 1727096120.24614: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007100f470><<< 11792 1727096120.24638: stdout chunk (state=3): >>> <<< 11792 1727096120.25705: stdout chunk (state=3): >>> <<< 11792 1727096120.25709: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "20", "epoch": "1727096120", "epoch_int": "1727096120", "date": "2024-09-23", "time": "08:55:20", "iso8601_micro": "2024-09-23T12:55:20.234810Z", "iso8601": "2024-09-23T12:55:20Z", "iso8601_basic": "20240923T085520234810", "iso8601_basic_short": "20240923T085520", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root"<<< 11792 1727096120.25918: stdout chunk (state=3): >>>, "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11792 1727096120.26775: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 11792 1727096120.26807: stdout chunk (state=3): >>> # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix<<< 11792 1727096120.26826: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io<<< 11792 1727096120.26849: stdout chunk (state=3): >>> # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site <<< 11792 1727096120.26872: stdout chunk (state=3): >>># destroy site # 
cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections<<< 11792 1727096120.26902: stdout chunk (state=3): >>> # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser<<< 11792 1727096120.26919: stdout chunk (state=3): >>> # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 11792 1727096120.26965: stdout chunk (state=3): >>> # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib <<< 11792 1727096120.27274: stdout chunk (state=3): >>># cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil<<< 11792 1727096120.27278: stdout chunk (state=3): >>> # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing 
_datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat<<< 11792 1727096120.27281: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc <<< 11792 1727096120.27283: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections <<< 11792 1727096120.27285: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 11792 1727096120.27287: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4<<< 11792 1727096120.27294: stdout chunk (state=3): >>> # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils<<< 11792 1727096120.27297: stdout chunk (state=3): >>> # destroy ansible.module_utils.common._utils # cleanup[2] removing 
ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle<<< 11792 1727096120.27342: stdout chunk (state=3): >>> # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline<<< 11792 1727096120.27370: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb <<< 11792 1727096120.27460: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd<<< 11792 1727096120.27473: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd <<< 11792 1727096120.27496: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips<<< 11792 1727096120.27522: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base<<< 11792 1727096120.27562: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd<<< 11792 1727096120.27585: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual<<< 11792 1727096120.27734: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11792 1727096120.28389: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11792 1727096120.28411: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11792 1727096120.28677: stdout chunk (state=3): >>># destroy _bz2 <<< 11792 1727096120.28687: stdout chunk (state=3): >>># destroy _compression <<< 11792 1727096120.28746: stdout chunk (state=3): >>># destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner<<< 11792 1727096120.28766: stdout chunk (state=3): >>> # destroy _json<<< 11792 1727096120.28794: stdout chunk (state=3): >>> # destroy grp # destroy 
encodings # destroy _locale<<< 11792 1727096120.28820: stdout chunk (state=3): >>> # destroy locale<<< 11792 1727096120.28844: stdout chunk (state=3): >>> # destroy select # destroy _signal<<< 11792 1727096120.28876: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog <<< 11792 1727096120.28958: stdout chunk (state=3): >>># destroy uuid # destroy selinux <<< 11792 1727096120.29073: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro<<< 11792 1727096120.29084: stdout chunk (state=3): >>> # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors<<< 11792 1727096120.29116: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing <<< 11792 1727096120.29171: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle<<< 11792 1727096120.29493: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob<<< 11792 1727096120.29673: stdout chunk (state=3): >>> # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout <<< 11792 1727096120.29676: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux<<< 11792 1727096120.29679: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes<<< 11792 1727096120.29899: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping 
_collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 11792 1727096120.29922: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 11792 1727096120.29934: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp<<< 11792 1727096120.29964: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 11792 1727096120.29987: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon<<< 11792 1727096120.30136: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11792 1727096120.30317: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11792 1727096120.30342: stdout chunk (state=3): >>># destroy _socket <<< 11792 1727096120.30432: stdout chunk (state=3): >>># destroy _collections <<< 11792 1727096120.30436: stdout chunk (state=3): >>># destroy platform<<< 11792 1727096120.30450: stdout chunk (state=3): >>> # destroy _uuid <<< 11792 1727096120.30646: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 11792 1727096120.30654: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 11792 1727096120.30693: stdout chunk (state=3): >>> # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external <<< 11792 1727096120.30758: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules<<< 11792 1727096120.30771: stdout chunk (state=3): >>> <<< 11792 1727096120.30841: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 11792 1727096120.30918: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 11792 1727096120.31020: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 11792 1727096120.31045: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 11792 1727096120.31225: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11792 
1727096120.31789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096120.31793: stdout chunk (state=3): >>><<< 11792 1727096120.31978: stderr chunk (state=3): >>><<< 11792 1727096120.32092: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00721b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072183b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00721b6a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071f65130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071f65fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa3e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa3f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fdb890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fdbf20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fbbb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fb9280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa1040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071ffb800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071ffa420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071ff8b60> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072030860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071fa02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0072030d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072030bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0072030f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071f9ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072031610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00720312e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072032510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072048710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0072049df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f007204ac90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007204b2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007204a1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007204bd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007204b4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072032540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d3bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d64710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d64470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d64590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d65010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071d659d0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0071d648c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d39df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d66de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d65b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0072032c30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d93140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071db34d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071e14200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071e16960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071e14320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071dd9250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071db22d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071d67d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f0071db2630> # zipimport: found 103 names in '/tmp/ansible_setup_payload_s2f6njbt/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007178ef90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007176de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007176cfe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007178ce30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00717be960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717be6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717be000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717be4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007178f9b0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00717bf6e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00717bf920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717bfe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071629ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007162b8c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162c2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162d460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162fef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071f9eed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071637f20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00716369f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071636750> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071636cc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007162e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007167c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167c350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007167ddc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167db80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00716802c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167e450> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071683aa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071680470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071684830> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071684a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00716843e0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007167c500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00715104d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071511a90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071686c30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071687fb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071686840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071515b80> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715169f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071511a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071516570> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071517a10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00715222d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007151d1c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007160aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00717ea750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00716854f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00716366c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b2840> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071168170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f00711684d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715a7410> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b3380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b0f20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b0a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007116b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007116acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007116aed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007116a120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007116b620> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00711ca150> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711c8170> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00715b0c20> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711ca300> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711caed0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f00712064b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f00711f62d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f007121a2d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071219ee0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0071016720> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0071014f50> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f007100f470> {"ansible_facts": {"ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "20", "epoch": "1727096120", "epoch_int": "1727096120", "date": "2024-09-23", "time": "08:55:20", "iso8601_micro": "2024-09-23T12:55:20.234810Z", "iso8601": "2024-09-23T12:55:20Z", "iso8601_basic": "20240923T085520234810", "iso8601_basic_short": "20240923T085520", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", 
"ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing 
_weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] 
removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy 
ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # 
destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath 
# cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
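
The module's standard output above interleaves the gathered ansible_facts JSON with a Python import and interpreter-shutdown trace; the remote interpreter is running in verbose mode (PYTHONVERBOSE=1 appears in ansible_env above), which produces the "# zipimport", "import ..." and "# cleanup" noise around the JSON document. The controller still recovers the facts and only warns about the trailing text, as the [WARNING] line that follows shows. The minimal Python sketch below illustrates one way such a mixed stdout could be split; it is not the controller's actual parsing code, the helper name split_module_stdout is purely illustrative, and it assumes the first '{' in the stream opens the JSON document.

import json

def split_module_stdout(stdout: str):
    """Split a module's stdout into (facts payload, trailing junk).

    Illustrative only: assumes the first '{' opens the JSON document and
    that any verbose-interpreter noise sits before or after it.
    """
    start = stdout.index("{")                      # skip the leading import trace
    payload, end = json.JSONDecoder().raw_decode(stdout, start)
    junk = stdout[end:].strip()                    # e.g. "# clear sys.path_importer_cache ..."
    return payload, junk

# Hypothetical usage, with the captured output held in a string named raw_output:
#   facts, junk = split_module_stdout(raw_output)
#   facts["ansible_facts"]["ansible_distribution"]   # -> "CentOS"
#   bool(junk)                                       # -> True, hence the warning below
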
[WARNING]: Module invocation had junk after the JSON data: [same Python interpreter shutdown trace as in the module output above, omitted here]
11792 1727096120.33571: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096120.33575: _low_level_execute_command(): starting 11792 1727096120.33578: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096119.7146897-11908-95445472081558/ > /dev/null 2>&1 && sleep 0' 11792 1727096120.33580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096120.33583: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096120.33585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.33587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096120.33589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096120.33591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096120.33593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096120.36375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096120.36379: stdout chunk (state=3): >>><<< 11792 1727096120.36381: stderr chunk (state=3): >>><<< 11792 1727096120.36384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096120.36386: handler run complete 11792 1727096120.36427: variable 'ansible_facts' from source: unknown 11792 1727096120.36493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096120.36616: variable 'ansible_facts' from source: unknown 11792 1727096120.36663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096120.36727: attempt loop complete, returning result 11792 1727096120.36731: _execute() done 11792 1727096120.36733: dumping result to json 11792 1727096120.36772: done dumping result, returning 11792 1727096120.36776: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-d9c7-3fc0-000000000026] 11792 1727096120.36778: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000026 11792 1727096120.37073: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000026 11792 1727096120.37077: WORKER PROCESS EXITING ok: [managed_node2] 11792 1727096120.37401: no more pending results, returning what we have 11792 1727096120.37405: results queue empty 11792 1727096120.37406: checking for any_errors_fatal 11792 1727096120.37407: done checking for any_errors_fatal 11792 1727096120.37408: checking for 
max_fail_percentage 11792 1727096120.37410: done checking for max_fail_percentage 11792 1727096120.37410: checking to see if all hosts have failed and the running result is not ok 11792 1727096120.37411: done checking to see if all hosts have failed 11792 1727096120.37412: getting the remaining hosts for this loop 11792 1727096120.37413: done getting the remaining hosts for this loop 11792 1727096120.37417: getting the next task for host managed_node2 11792 1727096120.37452: done getting next task for host managed_node2 11792 1727096120.37455: ^ task is: TASK: Check if system is ostree 11792 1727096120.37457: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096120.37461: getting variables 11792 1727096120.37462: in VariableManager get_vars() 11792 1727096120.37516: Calling all_inventory to load vars for managed_node2 11792 1727096120.37519: Calling groups_inventory to load vars for managed_node2 11792 1727096120.37523: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096120.37542: Calling all_plugins_play to load vars for managed_node2 11792 1727096120.37546: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096120.37554: Calling groups_plugins_play to load vars for managed_node2 11792 1727096120.38089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096120.38488: done with get_vars() 11792 1727096120.38501: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 08:55:20 -0400 (0:00:00.741) 0:00:02.667 ****** 11792 1727096120.38784: entering _queue_task() for managed_node2/stat 11792 1727096120.39146: worker is 1 (out of 1 available) 11792 1727096120.39162: exiting _queue_task() for managed_node2/stat 11792 1727096120.39178: done queuing things up, now waiting for results queue to drain 11792 1727096120.39180: waiting for pending results... 
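
The task queued next, "Check if system is ostree" from el_repo_setup.yml:17, is dispatched as a stat task for managed_node2; the test play presumably uses the result to decide whether the host is an image-based (rpm-ostree) system before doing any package setup. The task body and the path it stats are not shown in this log. As a point of reference only, rpm-ostree hosts are conventionally detected by the presence of the /run/ostree-booted marker file; a hypothetical standalone check along those lines is sketched below (the helper name and the marker path are assumptions, not taken from this log).

import os

def is_ostree_host(marker: str = "/run/ostree-booted") -> bool:
    """Hypothetical local equivalent of the 'Check if system is ostree' task.

    The play itself uses the stat module over SSH; the marker path used
    here is the usual rpm-ostree convention, assumed rather than read
    from this log.
    """
    return os.path.exists(marker)

if __name__ == "__main__":
    print("ostree (image-based) system" if is_ostree_host() else "package-based system")
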
11792 1727096120.39434: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 11792 1727096120.39539: in run() - task 0afff68d-5257-d9c7-3fc0-000000000028 11792 1727096120.39563: variable 'ansible_search_path' from source: unknown 11792 1727096120.39572: variable 'ansible_search_path' from source: unknown 11792 1727096120.39616: calling self._execute() 11792 1727096120.39773: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096120.39776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096120.39778: variable 'omit' from source: magic vars 11792 1727096120.40217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096120.40495: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096120.40544: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096120.40595: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096120.40698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096120.41004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096120.41221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096120.41225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096120.41227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096120.41281: Evaluated conditional (not __network_is_ostree is defined): True 11792 1727096120.41309: variable 'omit' from source: magic vars 11792 1727096120.41360: variable 'omit' from source: magic vars 11792 1727096120.41408: variable 'omit' from source: magic vars 11792 1727096120.41445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096120.41482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096120.41508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096120.41530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096120.41551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096120.41587: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096120.41597: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096120.41605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096120.41712: Set connection var ansible_timeout to 10 11792 1727096120.41726: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096120.41738: Set 
connection var ansible_shell_executable to /bin/sh 11792 1727096120.41747: Set connection var ansible_pipelining to False 11792 1727096120.41756: Set connection var ansible_shell_type to sh 11792 1727096120.41767: Set connection var ansible_connection to ssh 11792 1727096120.41796: variable 'ansible_shell_executable' from source: unknown 11792 1727096120.41810: variable 'ansible_connection' from source: unknown 11792 1727096120.41818: variable 'ansible_module_compression' from source: unknown 11792 1727096120.41826: variable 'ansible_shell_type' from source: unknown 11792 1727096120.41832: variable 'ansible_shell_executable' from source: unknown 11792 1727096120.41839: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096120.41847: variable 'ansible_pipelining' from source: unknown 11792 1727096120.41857: variable 'ansible_timeout' from source: unknown 11792 1727096120.41864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096120.42018: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096120.42032: variable 'omit' from source: magic vars 11792 1727096120.42041: starting attempt loop 11792 1727096120.42047: running the handler 11792 1727096120.42066: _low_level_execute_command(): starting 11792 1727096120.42081: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096120.42945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096120.43061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096120.43087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.43107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096120.43121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096120.43143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096120.43219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096120.45981: stdout chunk (state=3): >>>/root <<< 11792 1727096120.46175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096120.46179: stdout chunk (state=3): >>><<< 11792 1727096120.46181: stderr chunk (state=3): >>><<< 11792 1727096120.46186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096120.46196: _low_level_execute_command(): starting 11792 1727096120.46199: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635 `" && echo ansible-tmp-1727096120.4608135-11943-127584484573635="` echo /root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635 `" ) && sleep 0' 11792 1727096120.46967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096120.46987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096120.47001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096120.47020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096120.47047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.47084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096120.47158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.47181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096120.47200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096120.47214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096120.47296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096120.50160: stdout chunk (state=3): >>>ansible-tmp-1727096120.4608135-11943-127584484573635=/root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635 <<< 11792 1727096120.50573: stdout chunk (state=3): >>><<< 11792 1727096120.50577: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11792 1727096120.50579: stderr chunk (state=3): >>><<< 11792 1727096120.50582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096120.4608135-11943-127584484573635=/root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096120.50584: variable 'ansible_module_compression' from source: unknown 11792 1727096120.50586: ANSIBALLZ: Using lock for stat 11792 1727096120.50588: ANSIBALLZ: Acquiring lock 11792 1727096120.50590: ANSIBALLZ: Lock acquired: 139635227776000 11792 1727096120.50593: ANSIBALLZ: Creating module 11792 1727096120.62930: ANSIBALLZ: Writing module into payload 11792 1727096120.63045: ANSIBALLZ: Writing module 11792 1727096120.63076: ANSIBALLZ: Renaming module 11792 1727096120.63096: ANSIBALLZ: Done creating module 11792 1727096120.63121: variable 'ansible_facts' from source: unknown 11792 1727096120.63212: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/AnsiballZ_stat.py 11792 1727096120.63339: Sending initial data 11792 1727096120.63349: Sent initial data (153 bytes) 11792 1727096120.63800: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096120.63814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096120.63827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.63871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096120.63888: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096120.63906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096120.63946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096120.66261: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096120.66296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096120.66331: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpv4hy4k80 /root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/AnsiballZ_stat.py <<< 11792 1727096120.66341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/AnsiballZ_stat.py" <<< 11792 1727096120.66369: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpv4hy4k80" to remote "/root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/AnsiballZ_stat.py" <<< 11792 1727096120.66375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/AnsiballZ_stat.py" <<< 11792 1727096120.66895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096120.66942: stderr chunk (state=3): >>><<< 11792 1727096120.66946: stdout chunk (state=3): >>><<< 11792 1727096120.66976: done transferring module to remote 11792 1727096120.66988: _low_level_execute_command(): starting 11792 1727096120.66993: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/ /root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/AnsiballZ_stat.py && sleep 0' 11792 1727096120.67436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096120.67474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096120.67478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.67480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096120.67482: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.67521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096120.67533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096120.67583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096120.70180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096120.70208: stderr chunk (state=3): >>><<< 11792 1727096120.70211: stdout chunk (state=3): >>><<< 11792 1727096120.70226: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096120.70229: _low_level_execute_command(): starting 11792 1727096120.70235: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/AnsiballZ_stat.py && sleep 0' 11792 1727096120.70736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096120.70739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096120.70742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096120.70744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 
1727096120.70795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096120.70798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096120.70800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096120.70858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096120.74040: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11792 1727096120.74089: stdout chunk (state=3): >>>import _imp # builtin <<< 11792 1727096120.74112: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 11792 1727096120.74117: stdout chunk (state=3): >>>import '_weakref' # <<< 11792 1727096120.74244: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 11792 1727096120.74282: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11792 1727096120.74312: stdout chunk (state=3): >>>import 'time' # <<< 11792 1727096120.74316: stdout chunk (state=3): >>>import 'zipimport' # <<< 11792 1727096120.74333: stdout chunk (state=3): >>># installed zipimport hook <<< 11792 1727096120.74420: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 11792 1727096120.74444: stdout chunk (state=3): >>>import 'codecs' # <<< 11792 1727096120.74510: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11792 1727096120.74550: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504be7b30> <<< 11792 1727096120.74554: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 11792 1727096120.74580: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504c1aa50> import '_signal' # <<< 11792 1727096120.74608: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11792 1727096120.75003: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 11792 1727096120.75006: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a2d130> <<< 11792 1727096120.75042: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.75085: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a2dfa0> import 'site' # <<< 11792 1727096120.75116: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11792 1727096120.75477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11792 1727096120.75549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.75607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11792 1727096120.75630: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11792 1727096120.75785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11792 1727096120.75799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11792 1727096120.75898: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.75947: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 11792 1727096120.75977: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504aa3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504aa3ec0> <<< 11792 1727096120.76050: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a83b60> <<< 11792 1727096120.76064: stdout chunk (state=3): >>>import '_functools' # <<< 11792 1727096120.76099: stdout chunk (state=3): >>>import 'functools' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1504a812b0> <<< 11792 1727096120.76216: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a69070> <<< 11792 1727096120.76244: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11792 1727096120.76276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 11792 1727096120.76348: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11792 1727096120.76434: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11792 1727096120.76455: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504ac37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504ac23f0> <<< 11792 1727096120.76513: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504ac0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11792 1727096120.76562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11792 1727096120.76613: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504af8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af8bf0> <<< 11792 1727096120.76629: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504af8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a66e10> <<< 11792 1727096120.76679: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11792 1727096120.76797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504afa540> <<< 11792 1727096120.76800: stdout chunk (state=3): >>>import 'importlib.util' # <<< 11792 1727096120.76892: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11792 1727096120.76896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11792 1727096120.76914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 11792 1727096120.76928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504b10740> import 'errno' # <<< 11792 1727096120.76958: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504b11e20> <<< 11792 1727096120.76991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11792 1727096120.77083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11792 1727096120.77124: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504b12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504b132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504b12210> <<< 11792 1727096120.77128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11792 1727096120.77179: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504b13d70> <<< 11792 1727096120.77217: 
stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504b134a0> <<< 11792 1727096120.77283: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504afa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11792 1727096120.77308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11792 1727096120.77450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11792 1727096120.77454: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048d3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048fc470> <<< 11792 1727096120.77494: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 11792 1727096120.77517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11792 1727096120.77655: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.77778: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fd070> <<< 11792 1727096120.77975: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048fc920> <<< 11792 1727096120.78003: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048d1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11792 1727096120.78060: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11792 1727096120.78085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11792 1727096120.78311: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048fee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048fdb50> <<< 11792 1727096120.78315: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504afac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11792 1727096120.78410: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049271a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11792 1727096120.78439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.78474: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11792 1727096120.78488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11792 1727096120.78540: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150494b560> <<< 11792 1727096120.78564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11792 1727096120.78613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11792 1727096120.78731: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049ac2c0> <<< 11792 1727096120.78761: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11792 1727096120.78814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11792 1727096120.78876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11792 1727096120.79001: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049aea20> <<< 11792 1727096120.79187: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049ac3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150496d2b0> <<< 
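
A few entries below, the "# zipimport: found 30 names in '/tmp/ansible_stat_payload_zcgh_k0l/ansible_stat_payload.zip'" line marks the point where the transferred AnsiballZ_stat.py wrapper hands its embedded payload zip (the stat module plus the ansible.module_utils packages imported right after it) to the interpreter's zipimport machinery. The sketch below reproduces only that import-from-a-zip mechanism; it is not Ansible's wrapper code, and the demo_payload.zip / demo_module names are invented for illustration.

#!/usr/bin/env python3
"""Minimal sketch, not Ansible's wrapper: make a zip archive importable the
same way ansible_stat_payload.zip is consumed in the log above.
All file and module names here are illustrative assumptions."""
import os
import sys
import tempfile
import zipfile

workdir = tempfile.mkdtemp(prefix="payload_demo_")
payload = os.path.join(workdir, "demo_payload.zip")

# Stand-in for ansible_stat_payload.zip: one importable module inside a zip.
with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("demo_module.py", "def ping():\n    return 'pong'\n")

# Putting the archive on sys.path lets ordinary imports resolve against it;
# under PYTHONVERBOSE=1 (as used for AnsiballZ_stat.py above) the interpreter
# reports this as "# zipimport: found N names in '...zip'".
sys.path.insert(0, payload)
import demo_module

print(demo_module.ping())  # -> pong

Bundling the module together with its ansible.module_utils dependencies into one self-importing payload is what allows the controller to transfer a single file per task and execute it with nothing more than the remote /usr/bin/python3.12, which is exactly the transfer-chmod-execute sequence recorded earlier in this excerpt.
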
11792 1727096120.79241: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15047ad3d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150494a360> <<< 11792 1727096120.79245: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048ffd70> <<< 11792 1727096120.79396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11792 1727096120.79487: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f15047ad670> <<< 11792 1727096120.79742: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_zcgh_k0l/ansible_stat_payload.zip' # zipimport: zlib available <<< 11792 1727096120.79827: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.79963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11792 1727096120.79966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11792 1727096120.80108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11792 1727096120.80138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504803170> import '_typing' # <<< 11792 1727096120.80433: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15047e2060> <<< 11792 1727096120.80458: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15047e11f0> # zipimport: zlib available <<< 11792 1727096120.80556: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.80569: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.80737: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 11792 1727096120.82846: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.84786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 11792 1727096120.84793: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504801040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 11792 1727096120.84818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.84845: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches 
/usr/lib64/python3.12/json/decoder.py <<< 11792 1727096120.84880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 11792 1727096120.84892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11792 1727096120.84938: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.84954: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150482aae0> <<< 11792 1727096120.85007: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482a870> <<< 11792 1727096120.85057: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482a180> <<< 11792 1727096120.85086: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11792 1727096120.85114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11792 1727096120.85244: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482abd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504803e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150482b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150482b9e0> <<< 11792 1727096120.85273: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11792 1727096120.85358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11792 1727096120.85377: stdout chunk (state=3): >>>import '_locale' # <<< 11792 1727096120.85441: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482bef0> <<< 11792 1727096120.85473: stdout chunk (state=3): >>>import 'pwd' # <<< 11792 1727096120.85492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11792 1727096120.85546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11792 1727096120.85593: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150410dc70> <<< 11792 1727096120.85630: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 11792 
1727096120.85844: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150410f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504110290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504111430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11792 1727096120.85877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11792 1727096120.85912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 11792 1727096120.85915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11792 1727096120.85996: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504113f20> <<< 11792 1727096120.86045: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041185c0> <<< 11792 1727096120.86093: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041121e0> <<< 11792 1727096120.86106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11792 1727096120.86160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11792 1727096120.86196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11792 1727096120.86217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11792 1727096120.86304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11792 1727096120.86340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411bec0> import '_tokenize' # <<< 11792 1727096120.86446: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411a990> <<< 11792 1727096120.86463: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411a6f0> <<< 11792 1727096120.86501: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 11792 1727096120.86504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11792 1727096120.86615: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411ac60> <<< 11792 1727096120.86716: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504112660> <<< 11792 1727096120.86734: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.86853: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504163ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11792 1727096120.86885: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504165c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504165a00> <<< 11792 1727096120.86915: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11792 1727096120.87110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 11792 1727096120.87122: stdout chunk (state=3): >>> <<< 11792 1727096120.87190: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.87207: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041681a0> <<< 11792 1727096120.87221: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504166300> <<< 11792 1727096120.87263: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11792 1727096120.87334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.87359: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 11792 1727096120.87395: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11792 1727096120.87398: stdout chunk (state=3): >>>import '_string' # <<< 11792 1727096120.87480: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150416b950> <<< 11792 1727096120.87690: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504168350> <<< 11792 1727096120.87798: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.87802: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.87808: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416ca10> <<< 11792 1727096120.87850: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.87920: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416cb60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.87937: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.87940: stdout chunk (state=3): >>> import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416cb30> <<< 11792 1727096120.87970: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041642c0> <<< 11792 1727096120.88002: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 11792 1727096120.88043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11792 1727096120.88080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11792 1727096120.88163: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.88170: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041f43b0> <<< 11792 1727096120.88399: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.88416: stdout chunk (state=3): >>> # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.88422: stdout chunk (state=3): >>> <<< 11792 1727096120.88443: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041f55e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150416eb40><<< 11792 1727096120.88448: stdout chunk (state=3): >>> <<< 11792 1727096120.88480: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.88495: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150416e7b0><<< 11792 1727096120.88522: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11792 1727096120.88549: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.88587: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11792 1727096120.88743: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.88888: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.88924: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 11792 1727096120.88942: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.88964: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 11792 1727096120.88986: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11792 1727096120.89359: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.89383: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.90267: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.90273: stdout chunk (state=3): >>> <<< 11792 1727096120.91183: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11792 1727096120.91215: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 11792 1727096120.91218: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 11792 1727096120.91225: stdout chunk (state=3): >>> <<< 11792 1727096120.91246: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 11792 1727096120.91286: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 11792 1727096120.91290: stdout chunk (state=3): >>> <<< 11792 1727096120.91324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 11792 1727096120.91329: stdout chunk (state=3): >>> <<< 11792 1727096120.91403: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.91426: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 11792 1727096120.91437: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f15041fd7c0> <<< 11792 1727096120.91576: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 11792 1727096120.91592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11792 1727096120.91612: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041fe570> <<< 11792 1727096120.91701: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041f5730> import 'ansible.module_utils.compat.selinux' # <<< 11792 1727096120.91720: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.91798: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.91814: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11792 1727096120.92137: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.92337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 11792 1727096120.92356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041fe210> <<< 11792 1727096120.92380: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.93145: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.93895: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.94013: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11792 1727096120.94138: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11792 1727096120.94165: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.94222: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.94231: stdout chunk (state=3): >>> <<< 11792 1727096120.94289: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11792 1727096120.94304: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.94311: stdout chunk (state=3): >>> <<< 11792 1727096120.94422: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.94559: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11792 1727096120.94608: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.94612: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.94637: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available<<< 11792 1727096120.94646: stdout chunk (state=3): >>> <<< 11792 1727096120.94699: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.94747: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing.convert_bool' # <<< 11792 1727096120.94781: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.95147: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.95155: stdout chunk (state=3): >>> <<< 11792 1727096120.95532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 11792 1727096120.95633: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11792 1727096120.95679: stdout 
chunk (state=3): >>>import '_ast' # <<< 11792 1727096120.95794: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041ff620> <<< 11792 1727096120.95805: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.95933: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.96029: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11792 1727096120.96042: stdout chunk (state=3): >>> import 'ansible.module_utils.common.validation' # <<< 11792 1727096120.96065: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 11792 1727096120.96084: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 11792 1727096120.96119: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11792 1727096120.96189: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.96247: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11792 1727096120.96271: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11792 1727096120.96397: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11792 1727096120.96486: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.96593: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11792 1727096120.96671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.96798: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 11792 1727096120.96808: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150400a090> <<< 11792 1727096120.96906: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504007d10> import 'ansible.module_utils.common.file' # <<< 11792 1727096120.96913: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 11792 1727096120.97024: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 11792 1727096120.97031: stdout chunk (state=3): >>> <<< 11792 1727096120.97125: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.97130: stdout chunk (state=3): >>> <<< 11792 1727096120.97175: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.97181: stdout chunk (state=3): >>> <<< 11792 1727096120.97246: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 11792 1727096120.97252: stdout chunk (state=3): >>> <<< 11792 1727096120.97264: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11792 1727096120.97294: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 11792 1727096120.97298: stdout chunk (state=3): >>> <<< 11792 
1727096120.97335: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 11792 1727096120.97341: stdout chunk (state=3): >>> <<< 11792 1727096120.97374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11792 1727096120.97493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 11792 1727096120.97498: stdout chunk (state=3): >>> <<< 11792 1727096120.97526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 11792 1727096120.97621: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048867e0> <<< 11792 1727096120.97693: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048724b0> <<< 11792 1727096120.97809: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041fc770> <<< 11792 1727096120.97837: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15040004d0> <<< 11792 1727096120.97845: stdout chunk (state=3): >>># destroy ansible.module_utils.distro <<< 11792 1727096120.97862: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 11792 1727096120.97883: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.97887: stdout chunk (state=3): >>> <<< 11792 1727096120.97941: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.97958: stdout chunk (state=3): >>> <<< 11792 1727096120.98003: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 11792 1727096120.98079: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 11792 1727096120.98099: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11792 1727096120.98124: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 11792 1727096120.98152: stdout chunk (state=3): >>> <<< 11792 1727096120.98170: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.98395: stdout chunk (state=3): >>># zipimport: zlib available <<< 11792 1727096120.98694: stdout chunk (state=3): >>># zipimport: zlib available<<< 11792 1727096120.98848: stdout chunk (state=3): >>> <<< 11792 1727096120.98855: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11792 1727096120.98892: stdout chunk (state=3): >>># destroy __main__ <<< 11792 1727096120.99373: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 11792 1727096120.99399: stdout chunk (state=3): >>> <<< 11792 1727096120.99418: stdout chunk (state=3): >>># clear sys.path_hooks <<< 11792 1727096120.99448: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1<<< 11792 1727096120.99459: stdout chunk (state=3): >>> # clear sys.ps2<<< 11792 1727096120.99474: stdout chunk (state=3): >>> # clear sys.last_exc # clear sys.last_type<<< 11792 1727096120.99490: stdout chunk (state=3): >>> # clear 
sys.last_value # clear sys.last_traceback<<< 11792 1727096120.99657: stdout chunk (state=3): >>> # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # 
cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] re<<< 11792 1727096120.99686: stdout chunk (state=3): >>>moving locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # 
destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.m<<< 11792 1727096120.99693: stdout chunk (state=3): >>>odules # destroy ansible.modules <<< 11792 1727096121.00053: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11792 1727096121.00083: stdout chunk (state=3): >>># destroy importlib.machinery <<< 11792 1727096121.00090: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util <<< 11792 1727096121.00124: stdout chunk (state=3): >>># destroy _bz2 <<< 11792 1727096121.00141: stdout chunk (state=3): >>># destroy _compression <<< 11792 1727096121.00146: stdout chunk (state=3): >>># destroy _lzma <<< 11792 1727096121.00172: stdout chunk (state=3): >>># destroy _blake2 <<< 11792 1727096121.00176: stdout chunk (state=3): >>># destroy binascii<<< 11792 1727096121.00197: stdout chunk (state=3): >>> # destroy struct # destroy zlib<<< 11792 1727096121.00221: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob<<< 11792 1727096121.00237: stdout chunk (state=3): >>> # destroy fnmatch # destroy ipaddress<<< 11792 1727096121.00278: stdout chunk (state=3): >>> # destroy ntpath<<< 11792 1727096121.00289: stdout chunk (state=3): >>> <<< 11792 1727096121.00303: stdout chunk (state=3): >>># destroy importlib # destroy zipimport<<< 11792 1727096121.00322: stdout chunk (state=3): >>> # destroy __main__<<< 11792 1727096121.00337: stdout chunk (state=3): >>> # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder<<< 11792 1727096121.00347: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json<<< 11792 1727096121.00376: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale<<< 11792 1727096121.00398: stdout chunk (state=3): >>> # destroy pwd # destroy locale<<< 11792 1727096121.00412: stdout chunk (state=3): >>> # destroy signal # destroy fcntl # destroy select<<< 11792 1727096121.00427: stdout chunk (state=3): >>> # destroy _signal # destroy _posixsubprocess<<< 11792 1727096121.00434: stdout chunk (state=3): >>> # destroy syslog<<< 11792 1727096121.00470: stdout chunk (state=3): >>> # destroy uuid # destroy selectors <<< 11792 1727096121.00480: stdout chunk (state=3): >>># destroy errno<<< 11792 1727096121.00495: stdout chunk (state=3): >>> # destroy array <<< 11792 1727096121.00533: stdout chunk (state=3): >>># destroy datetime # destroy selinux<<< 11792 1727096121.00539: stdout chunk (state=3): 
>>> # destroy shutil<<< 11792 1727096121.00574: stdout chunk (state=3): >>> # destroy distro <<< 11792 1727096121.00580: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy json<<< 11792 1727096121.00651: stdout chunk (state=3): >>> # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux<<< 11792 1727096121.00674: stdout chunk (state=3): >>> <<< 11792 1727096121.00677: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian <<< 11792 1727096121.00703: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes <<< 11792 1727096121.00708: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon<<< 11792 1727096121.00733: stdout chunk (state=3): >>> # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 11792 1727096121.00739: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string<<< 11792 1727096121.00761: stdout chunk (state=3): >>> # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap<<< 11792 1727096121.00785: stdout chunk (state=3): >>> # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 11792 1727096121.00797: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing<<< 11792 1727096121.00821: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 11792 1727096121.00824: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 11792 1727096121.00849: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 11792 1727096121.00858: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 11792 1727096121.00878: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 11792 1727096121.00891: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 11792 1727096121.00909: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 11792 1727096121.00915: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 11792 1727096121.00940: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types<<< 11792 1727096121.00957: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 11792 1727096121.00969: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 11792 1727096121.00984: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8<<< 11792 1727096121.01003: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping 
_frozen_importlib_external<<< 11792 1727096121.01008: stdout chunk (state=3): >>> # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io<<< 11792 1727096121.01039: stdout chunk (state=3): >>> # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 11792 1727096121.01055: stdout chunk (state=3): >>> # cleanup[3] wiping builtins<<< 11792 1727096121.01062: stdout chunk (state=3): >>> # destroy selinux._selinux <<< 11792 1727096121.01129: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11792 1727096121.01311: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11792 1727096121.01314: stdout chunk (state=3): >>># destroy _socket <<< 11792 1727096121.01344: stdout chunk (state=3): >>># destroy _collections <<< 11792 1727096121.01383: stdout chunk (state=3): >>># destroy platform <<< 11792 1727096121.01398: stdout chunk (state=3): >>># destroy _uuid # destroy stat<<< 11792 1727096121.01415: stdout chunk (state=3): >>> <<< 11792 1727096121.01418: stdout chunk (state=3): >>># destroy genericpath # destroy re._parser<<< 11792 1727096121.01421: stdout chunk (state=3): >>> # destroy tokenize<<< 11792 1727096121.01455: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib<<< 11792 1727096121.01460: stdout chunk (state=3): >>> # destroy copyreg<<< 11792 1727096121.01463: stdout chunk (state=3): >>> <<< 11792 1727096121.01497: stdout chunk (state=3): >>># destroy contextlib # destroy _typing<<< 11792 1727096121.01524: stdout chunk (state=3): >>> # destroy _tokenize<<< 11792 1727096121.01528: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse<<< 11792 1727096121.01533: stdout chunk (state=3): >>> <<< 11792 1727096121.01557: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 11792 1727096121.01561: stdout chunk (state=3): >>> # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 11792 1727096121.01591: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path <<< 11792 1727096121.01597: stdout chunk (state=3): >>># clear sys.modules <<< 11792 1727096121.01626: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 11792 1727096121.01719: stdout chunk (state=3): >>># destroy codecs<<< 11792 1727096121.01724: stdout chunk (state=3): >>> # destroy encodings.aliases <<< 11792 1727096121.01739: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs<<< 11792 1727096121.01757: stdout chunk (state=3): >>> # destroy io # destroy traceback # destroy warnings # destroy weakref<<< 11792 1727096121.01770: stdout chunk (state=3): >>> # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 11792 1727096121.01775: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 11792 1727096121.01813: stdout chunk (state=3): >>> # destroy _random # destroy _weakref<<< 11792 1727096121.01817: stdout chunk (state=3): >>> <<< 11792 1727096121.01842: stdout chunk (state=3): >>># destroy _hashlib<<< 11792 
1727096121.01872: stdout chunk (state=3): >>> # destroy _operator<<< 11792 1727096121.01875: stdout chunk (state=3): >>> <<< 11792 1727096121.01879: stdout chunk (state=3): >>># destroy _string # destroy re<<< 11792 1727096121.01899: stdout chunk (state=3): >>> # destroy itertools <<< 11792 1727096121.01916: stdout chunk (state=3): >>># destroy _abc # destroy _sre <<< 11792 1727096121.01934: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 11792 1727096121.01959: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 11792 1727096121.01971: stdout chunk (state=3): >>> <<< 11792 1727096121.02423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096121.02460: stderr chunk (state=3): >>><<< 11792 1727096121.02463: stdout chunk (state=3): >>><<< 11792 1727096121.02532: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504aa3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504aa3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a69070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504ac37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504ac23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504ac0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504af8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504af8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504a66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504af9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504afa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504b10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504b11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1504b12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504b132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504b12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504b13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504b134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504afa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048d3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048fc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fd070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15048fda60> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f15048fc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048d1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048fee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048fdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504afac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049271a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150494b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049ac2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049aea20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15049ac3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150496d2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15047ad3d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150494a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048ffd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f15047ad670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_zcgh_k0l/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504803170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15047e2060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15047e11f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504801040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150482aae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482a870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482a180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482abd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504803e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150482b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150482b9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150482bef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150410dc70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150410f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504110290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504111430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504113f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041185c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041121e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411bec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411a990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411a6f0> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150411ac60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504112660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504163ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1504165c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504165a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041681a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504166300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150416b950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504168350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416ca10> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416cb60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416cb30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041642c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041f43b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041f55e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150416eb40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150416fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f150416e7b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f15041fd7c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041fe570> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041f5730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041fe210> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041ff620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f150400a090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1504007d10> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048867e0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15048724b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15041fc770> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f15040004d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11792 1727096121.03052: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096121.03056: _low_level_execute_command(): starting 11792 1727096121.03058: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727096120.4608135-11943-127584484573635/ > /dev/null 2>&1 && sleep 0' 11792 1727096121.03217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096121.03224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096121.03237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096121.03291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096121.03294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096121.03296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096121.03346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096121.06034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096121.06063: stderr chunk (state=3): >>><<< 11792 1727096121.06066: stdout chunk (state=3): >>><<< 11792 1727096121.06082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096121.06088: handler run complete 11792 1727096121.06103: attempt loop complete, returning result 11792 1727096121.06108: _execute() done 11792 1727096121.06111: dumping result to json 11792 1727096121.06113: done dumping result, returning 11792 1727096121.06125: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0afff68d-5257-d9c7-3fc0-000000000028] 11792 1727096121.06127: sending task result for task 
0afff68d-5257-d9c7-3fc0-000000000028 11792 1727096121.06214: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000028 11792 1727096121.06217: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11792 1727096121.06309: no more pending results, returning what we have 11792 1727096121.06312: results queue empty 11792 1727096121.06313: checking for any_errors_fatal 11792 1727096121.06319: done checking for any_errors_fatal 11792 1727096121.06320: checking for max_fail_percentage 11792 1727096121.06321: done checking for max_fail_percentage 11792 1727096121.06322: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.06322: done checking to see if all hosts have failed 11792 1727096121.06323: getting the remaining hosts for this loop 11792 1727096121.06325: done getting the remaining hosts for this loop 11792 1727096121.06328: getting the next task for host managed_node2 11792 1727096121.06340: done getting next task for host managed_node2 11792 1727096121.06343: ^ task is: TASK: Set flag to indicate system is ostree 11792 1727096121.06345: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.06348: getting variables 11792 1727096121.06349: in VariableManager get_vars() 11792 1727096121.06380: Calling all_inventory to load vars for managed_node2 11792 1727096121.06383: Calling groups_inventory to load vars for managed_node2 11792 1727096121.06386: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.06397: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.06399: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.06402: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.06548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.06698: done with get_vars() 11792 1727096121.06707: done getting variables 11792 1727096121.06783: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:55:21 -0400 (0:00:00.680) 0:00:03.347 ****** 11792 1727096121.06806: entering _queue_task() for managed_node2/set_fact 11792 1727096121.06807: Creating lock for set_fact 11792 1727096121.07034: worker is 1 (out of 1 available) 11792 1727096121.07047: exiting _queue_task() for managed_node2/set_fact 11792 1727096121.07060: done queuing things up, now waiting for results queue to drain 11792 1727096121.07062: waiting for pending results... 
11792 1727096121.07220: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 11792 1727096121.07285: in run() - task 0afff68d-5257-d9c7-3fc0-000000000029 11792 1727096121.07296: variable 'ansible_search_path' from source: unknown 11792 1727096121.07299: variable 'ansible_search_path' from source: unknown 11792 1727096121.07332: calling self._execute() 11792 1727096121.07390: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.07394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.07403: variable 'omit' from source: magic vars 11792 1727096121.07749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096121.07976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096121.08007: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096121.08031: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096121.08059: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096121.08124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096121.08141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096121.08163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096121.08183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096121.08276: Evaluated conditional (not __network_is_ostree is defined): True 11792 1727096121.08279: variable 'omit' from source: magic vars 11792 1727096121.08306: variable 'omit' from source: magic vars 11792 1727096121.08391: variable '__ostree_booted_stat' from source: set_fact 11792 1727096121.08430: variable 'omit' from source: magic vars 11792 1727096121.08450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096121.08474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096121.08490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096121.08503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096121.08513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096121.08537: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096121.08541: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.08543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.08611: Set connection var ansible_timeout to 10 11792 1727096121.08618: 
Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096121.08631: Set connection var ansible_shell_executable to /bin/sh 11792 1727096121.08636: Set connection var ansible_pipelining to False 11792 1727096121.08638: Set connection var ansible_shell_type to sh 11792 1727096121.08640: Set connection var ansible_connection to ssh 11792 1727096121.08656: variable 'ansible_shell_executable' from source: unknown 11792 1727096121.08660: variable 'ansible_connection' from source: unknown 11792 1727096121.08662: variable 'ansible_module_compression' from source: unknown 11792 1727096121.08664: variable 'ansible_shell_type' from source: unknown 11792 1727096121.08667: variable 'ansible_shell_executable' from source: unknown 11792 1727096121.08671: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.08674: variable 'ansible_pipelining' from source: unknown 11792 1727096121.08676: variable 'ansible_timeout' from source: unknown 11792 1727096121.08680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.08752: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096121.08762: variable 'omit' from source: magic vars 11792 1727096121.08766: starting attempt loop 11792 1727096121.08771: running the handler 11792 1727096121.08780: handler run complete 11792 1727096121.08788: attempt loop complete, returning result 11792 1727096121.08790: _execute() done 11792 1727096121.08793: dumping result to json 11792 1727096121.08796: done dumping result, returning 11792 1727096121.08804: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0afff68d-5257-d9c7-3fc0-000000000029] 11792 1727096121.08806: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000029 11792 1727096121.08890: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000029 11792 1727096121.08893: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11792 1727096121.08964: no more pending results, returning what we have 11792 1727096121.08967: results queue empty 11792 1727096121.08969: checking for any_errors_fatal 11792 1727096121.08975: done checking for any_errors_fatal 11792 1727096121.08976: checking for max_fail_percentage 11792 1727096121.08978: done checking for max_fail_percentage 11792 1727096121.08978: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.08979: done checking to see if all hosts have failed 11792 1727096121.08980: getting the remaining hosts for this loop 11792 1727096121.08981: done getting the remaining hosts for this loop 11792 1727096121.08984: getting the next task for host managed_node2 11792 1727096121.08991: done getting next task for host managed_node2 11792 1727096121.08993: ^ task is: TASK: Fix CentOS6 Base repo 11792 1727096121.08995: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.08998: getting variables 11792 1727096121.08999: in VariableManager get_vars() 11792 1727096121.09028: Calling all_inventory to load vars for managed_node2 11792 1727096121.09030: Calling groups_inventory to load vars for managed_node2 11792 1727096121.09033: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.09041: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.09044: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.09053: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.09215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.09330: done with get_vars() 11792 1727096121.09337: done getting variables 11792 1727096121.09430: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:55:21 -0400 (0:00:00.026) 0:00:03.373 ****** 11792 1727096121.09453: entering _queue_task() for managed_node2/copy 11792 1727096121.09672: worker is 1 (out of 1 available) 11792 1727096121.09684: exiting _queue_task() for managed_node2/copy 11792 1727096121.09696: done queuing things up, now waiting for results queue to drain 11792 1727096121.09697: waiting for pending results... 
11792 1727096121.09842: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 11792 1727096121.09905: in run() - task 0afff68d-5257-d9c7-3fc0-00000000002b 11792 1727096121.09918: variable 'ansible_search_path' from source: unknown 11792 1727096121.09922: variable 'ansible_search_path' from source: unknown 11792 1727096121.09956: calling self._execute() 11792 1727096121.10010: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.10015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.10024: variable 'omit' from source: magic vars 11792 1727096121.10371: variable 'ansible_distribution' from source: facts 11792 1727096121.10387: Evaluated conditional (ansible_distribution == 'CentOS'): True 11792 1727096121.10470: variable 'ansible_distribution_major_version' from source: facts 11792 1727096121.10474: Evaluated conditional (ansible_distribution_major_version == '6'): False 11792 1727096121.10477: when evaluation is False, skipping this task 11792 1727096121.10479: _execute() done 11792 1727096121.10484: dumping result to json 11792 1727096121.10486: done dumping result, returning 11792 1727096121.10493: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0afff68d-5257-d9c7-3fc0-00000000002b] 11792 1727096121.10496: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000002b 11792 1727096121.10590: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000002b skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11792 1727096121.10655: no more pending results, returning what we have 11792 1727096121.10659: results queue empty 11792 1727096121.10660: checking for any_errors_fatal 11792 1727096121.10665: done checking for any_errors_fatal 11792 1727096121.10665: checking for max_fail_percentage 11792 1727096121.10669: done checking for max_fail_percentage 11792 1727096121.10669: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.10670: done checking to see if all hosts have failed 11792 1727096121.10671: getting the remaining hosts for this loop 11792 1727096121.10673: done getting the remaining hosts for this loop 11792 1727096121.10676: getting the next task for host managed_node2 11792 1727096121.10681: done getting next task for host managed_node2 11792 1727096121.10683: ^ task is: TASK: Include the task 'enable_epel.yml' 11792 1727096121.10686: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096121.10689: getting variables 11792 1727096121.10690: in VariableManager get_vars() 11792 1727096121.10714: Calling all_inventory to load vars for managed_node2 11792 1727096121.10716: Calling groups_inventory to load vars for managed_node2 11792 1727096121.10718: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.10728: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.10730: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.10732: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.10862: WORKER PROCESS EXITING 11792 1727096121.10875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.10995: done with get_vars() 11792 1727096121.11003: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 08:55:21 -0400 (0:00:00.016) 0:00:03.389 ****** 11792 1727096121.11069: entering _queue_task() for managed_node2/include_tasks 11792 1727096121.11284: worker is 1 (out of 1 available) 11792 1727096121.11297: exiting _queue_task() for managed_node2/include_tasks 11792 1727096121.11309: done queuing things up, now waiting for results queue to drain 11792 1727096121.11310: waiting for pending results... 11792 1727096121.11455: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 11792 1727096121.11517: in run() - task 0afff68d-5257-d9c7-3fc0-00000000002c 11792 1727096121.11528: variable 'ansible_search_path' from source: unknown 11792 1727096121.11533: variable 'ansible_search_path' from source: unknown 11792 1727096121.11566: calling self._execute() 11792 1727096121.11619: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.11623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.11632: variable 'omit' from source: magic vars 11792 1727096121.12039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096121.13629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096121.13676: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096121.13704: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096121.13742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096121.13764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096121.13824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096121.13847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096121.13869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096121.13896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096121.13906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096121.13991: variable '__network_is_ostree' from source: set_fact 11792 1727096121.14004: Evaluated conditional (not __network_is_ostree | d(false)): True 11792 1727096121.14010: _execute() done 11792 1727096121.14012: dumping result to json 11792 1727096121.14015: done dumping result, returning 11792 1727096121.14021: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-d9c7-3fc0-00000000002c] 11792 1727096121.14024: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000002c 11792 1727096121.14111: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000002c 11792 1727096121.14114: WORKER PROCESS EXITING 11792 1727096121.14140: no more pending results, returning what we have 11792 1727096121.14145: in VariableManager get_vars() 11792 1727096121.14181: Calling all_inventory to load vars for managed_node2 11792 1727096121.14183: Calling groups_inventory to load vars for managed_node2 11792 1727096121.14186: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.14197: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.14199: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.14202: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.14400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.14513: done with get_vars() 11792 1727096121.14518: variable 'ansible_search_path' from source: unknown 11792 1727096121.14519: variable 'ansible_search_path' from source: unknown 11792 1727096121.14544: we have included files to process 11792 1727096121.14544: generating all_blocks data 11792 1727096121.14545: done generating all_blocks data 11792 1727096121.14549: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11792 1727096121.14552: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11792 1727096121.14553: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11792 1727096121.15032: done processing included file 11792 1727096121.15034: iterating over new_blocks loaded from include file 11792 1727096121.15036: in VariableManager get_vars() 11792 1727096121.15046: done with get_vars() 11792 1727096121.15047: filtering new block on tags 11792 1727096121.15062: done filtering new block on tags 11792 1727096121.15064: in VariableManager get_vars() 11792 1727096121.15072: done with get_vars() 11792 1727096121.15073: filtering new block on tags 11792 1727096121.15080: done filtering new block on tags 11792 1727096121.15081: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for 
managed_node2 11792 1727096121.15085: extending task lists for all hosts with included blocks 11792 1727096121.15143: done extending task lists 11792 1727096121.15144: done processing included files 11792 1727096121.15145: results queue empty 11792 1727096121.15145: checking for any_errors_fatal 11792 1727096121.15148: done checking for any_errors_fatal 11792 1727096121.15149: checking for max_fail_percentage 11792 1727096121.15150: done checking for max_fail_percentage 11792 1727096121.15150: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.15151: done checking to see if all hosts have failed 11792 1727096121.15152: getting the remaining hosts for this loop 11792 1727096121.15152: done getting the remaining hosts for this loop 11792 1727096121.15154: getting the next task for host managed_node2 11792 1727096121.15157: done getting next task for host managed_node2 11792 1727096121.15158: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11792 1727096121.15160: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096121.15161: getting variables 11792 1727096121.15162: in VariableManager get_vars() 11792 1727096121.15169: Calling all_inventory to load vars for managed_node2 11792 1727096121.15185: Calling groups_inventory to load vars for managed_node2 11792 1727096121.15187: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.15191: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.15197: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.15199: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.15281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.15393: done with get_vars() 11792 1727096121.15399: done getting variables 11792 1727096121.15446: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11792 1727096121.15583: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 08:55:21 -0400 (0:00:00.045) 0:00:03.435 ****** 11792 1727096121.15616: entering _queue_task() for managed_node2/command 11792 1727096121.15617: Creating lock for command 11792 1727096121.15851: worker is 1 (out of 1 available) 11792 1727096121.15863: exiting _queue_task() for managed_node2/command 11792 1727096121.15876: done queuing things up, now waiting for results queue to drain 11792 1727096121.15877: waiting for pending results... 
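The "TASK [Create EPEL 10]" banner above is the templated task "Create EPEL {{ ansible_distribution_major_version }}" from enable_epel.yml:8. The entries that follow evaluate its conditionals against the gathered facts and skip it, because ansible_distribution_major_version (10 here) is not in ['7', '8']. A minimal sketch of a task with this shape, built only from the name, the command action plugin, and the two conditionals visible in the log; the command body itself is a placeholder assumption:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      # Placeholder body: the real command is not shown in this log excerpt.
      command: echo "install the EPEL {{ ansible_distribution_major_version }} release package here"
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

The two conditions are evaluated in order; the first is True and the second is False, which produces the skipping result shown below.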
11792 1727096121.16029: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 11792 1727096121.16102: in run() - task 0afff68d-5257-d9c7-3fc0-000000000046 11792 1727096121.16110: variable 'ansible_search_path' from source: unknown 11792 1727096121.16114: variable 'ansible_search_path' from source: unknown 11792 1727096121.16142: calling self._execute() 11792 1727096121.16199: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.16203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.16213: variable 'omit' from source: magic vars 11792 1727096121.16483: variable 'ansible_distribution' from source: facts 11792 1727096121.16491: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11792 1727096121.16578: variable 'ansible_distribution_major_version' from source: facts 11792 1727096121.16582: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11792 1727096121.16587: when evaluation is False, skipping this task 11792 1727096121.16589: _execute() done 11792 1727096121.16592: dumping result to json 11792 1727096121.16594: done dumping result, returning 11792 1727096121.16605: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0afff68d-5257-d9c7-3fc0-000000000046] 11792 1727096121.16608: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000046 11792 1727096121.16695: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000046 11792 1727096121.16698: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11792 1727096121.16749: no more pending results, returning what we have 11792 1727096121.16753: results queue empty 11792 1727096121.16754: checking for any_errors_fatal 11792 1727096121.16755: done checking for any_errors_fatal 11792 1727096121.16755: checking for max_fail_percentage 11792 1727096121.16757: done checking for max_fail_percentage 11792 1727096121.16757: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.16758: done checking to see if all hosts have failed 11792 1727096121.16759: getting the remaining hosts for this loop 11792 1727096121.16760: done getting the remaining hosts for this loop 11792 1727096121.16764: getting the next task for host managed_node2 11792 1727096121.16772: done getting next task for host managed_node2 11792 1727096121.16774: ^ task is: TASK: Install yum-utils package 11792 1727096121.16778: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096121.16782: getting variables 11792 1727096121.16783: in VariableManager get_vars() 11792 1727096121.16811: Calling all_inventory to load vars for managed_node2 11792 1727096121.16813: Calling groups_inventory to load vars for managed_node2 11792 1727096121.16816: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.16826: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.16828: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.16831: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.17098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.17209: done with get_vars() 11792 1727096121.17216: done getting variables 11792 1727096121.17292: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 08:55:21 -0400 (0:00:00.016) 0:00:03.452 ****** 11792 1727096121.17314: entering _queue_task() for managed_node2/package 11792 1727096121.17315: Creating lock for package 11792 1727096121.17558: worker is 1 (out of 1 available) 11792 1727096121.17572: exiting _queue_task() for managed_node2/package 11792 1727096121.17584: done queuing things up, now waiting for results queue to drain 11792 1727096121.17586: waiting for pending results... 
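The "TASK [Install yum-utils package]" banner above points at enable_epel.yml:26 and loads the package action plugin; the entries that follow skip it under the same distribution/version conditionals as the previous task. A minimal sketch consistent with those details, where the package name and state are assumptions inferred from the task name:

    - name: Install yum-utils package
      package:
        name: yum-utils      # assumed from the task name
        state: present       # assumed; not shown in the log
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']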
11792 1727096121.17726: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 11792 1727096121.17800: in run() - task 0afff68d-5257-d9c7-3fc0-000000000047 11792 1727096121.17816: variable 'ansible_search_path' from source: unknown 11792 1727096121.17819: variable 'ansible_search_path' from source: unknown 11792 1727096121.17844: calling self._execute() 11792 1727096121.17904: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.17908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.17922: variable 'omit' from source: magic vars 11792 1727096121.18189: variable 'ansible_distribution' from source: facts 11792 1727096121.18199: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11792 1727096121.18287: variable 'ansible_distribution_major_version' from source: facts 11792 1727096121.18291: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11792 1727096121.18294: when evaluation is False, skipping this task 11792 1727096121.18298: _execute() done 11792 1727096121.18300: dumping result to json 11792 1727096121.18304: done dumping result, returning 11792 1727096121.18311: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0afff68d-5257-d9c7-3fc0-000000000047] 11792 1727096121.18314: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000047 11792 1727096121.18400: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000047 11792 1727096121.18403: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11792 1727096121.18448: no more pending results, returning what we have 11792 1727096121.18454: results queue empty 11792 1727096121.18455: checking for any_errors_fatal 11792 1727096121.18462: done checking for any_errors_fatal 11792 1727096121.18463: checking for max_fail_percentage 11792 1727096121.18465: done checking for max_fail_percentage 11792 1727096121.18465: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.18466: done checking to see if all hosts have failed 11792 1727096121.18467: getting the remaining hosts for this loop 11792 1727096121.18470: done getting the remaining hosts for this loop 11792 1727096121.18474: getting the next task for host managed_node2 11792 1727096121.18479: done getting next task for host managed_node2 11792 1727096121.18481: ^ task is: TASK: Enable EPEL 7 11792 1727096121.18485: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096121.18488: getting variables 11792 1727096121.18489: in VariableManager get_vars() 11792 1727096121.18515: Calling all_inventory to load vars for managed_node2 11792 1727096121.18517: Calling groups_inventory to load vars for managed_node2 11792 1727096121.18519: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.18529: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.18531: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.18533: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.18662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.18782: done with get_vars() 11792 1727096121.18792: done getting variables 11792 1727096121.18832: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 08:55:21 -0400 (0:00:00.015) 0:00:03.467 ****** 11792 1727096121.18852: entering _queue_task() for managed_node2/command 11792 1727096121.19046: worker is 1 (out of 1 available) 11792 1727096121.19058: exiting _queue_task() for managed_node2/command 11792 1727096121.19071: done queuing things up, now waiting for results queue to drain 11792 1727096121.19073: waiting for pending results... 11792 1727096121.19383: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 11792 1727096121.19388: in run() - task 0afff68d-5257-d9c7-3fc0-000000000048 11792 1727096121.19392: variable 'ansible_search_path' from source: unknown 11792 1727096121.19394: variable 'ansible_search_path' from source: unknown 11792 1727096121.19398: calling self._execute() 11792 1727096121.19477: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.19495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.19524: variable 'omit' from source: magic vars 11792 1727096121.19939: variable 'ansible_distribution' from source: facts 11792 1727096121.19956: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11792 1727096121.20137: variable 'ansible_distribution_major_version' from source: facts 11792 1727096121.20174: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11792 1727096121.20183: when evaluation is False, skipping this task 11792 1727096121.20195: _execute() done 11792 1727096121.20199: dumping result to json 11792 1727096121.20201: done dumping result, returning 11792 1727096121.20207: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0afff68d-5257-d9c7-3fc0-000000000048] 11792 1727096121.20209: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000048 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11792 1727096121.20377: no more pending results, returning what we have 11792 1727096121.20381: results queue empty 11792 1727096121.20382: checking for any_errors_fatal 11792 1727096121.20387: done checking 
for any_errors_fatal 11792 1727096121.20388: checking for max_fail_percentage 11792 1727096121.20389: done checking for max_fail_percentage 11792 1727096121.20390: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.20390: done checking to see if all hosts have failed 11792 1727096121.20391: getting the remaining hosts for this loop 11792 1727096121.20393: done getting the remaining hosts for this loop 11792 1727096121.20396: getting the next task for host managed_node2 11792 1727096121.20402: done getting next task for host managed_node2 11792 1727096121.20404: ^ task is: TASK: Enable EPEL 8 11792 1727096121.20408: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.20413: getting variables 11792 1727096121.20414: in VariableManager get_vars() 11792 1727096121.20446: Calling all_inventory to load vars for managed_node2 11792 1727096121.20449: Calling groups_inventory to load vars for managed_node2 11792 1727096121.20452: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.20462: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.20464: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.20466: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.20634: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000048 11792 1727096121.20643: WORKER PROCESS EXITING 11792 1727096121.20655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.20772: done with get_vars() 11792 1727096121.20779: done getting variables 11792 1727096121.20819: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 08:55:21 -0400 (0:00:00.019) 0:00:03.487 ****** 11792 1727096121.20839: entering _queue_task() for managed_node2/command 11792 1727096121.21051: worker is 1 (out of 1 available) 11792 1727096121.21063: exiting _queue_task() for managed_node2/command 11792 1727096121.21076: done queuing things up, now waiting for results queue to drain 11792 1727096121.21078: waiting for pending results... 
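The "TASK [Enable EPEL 7]" (enable_epel.yml:32) and "TASK [Enable EPEL 8]" (enable_epel.yml:37) banners above are both command tasks, and both are skipped the same way because ansible_distribution_major_version in ['7', '8'] is False on this host; since when-lists stop at the first False condition, any further version-specific conditions they carry never appear in the log. A minimal sketch of the pair, with the actual commands left as hedged placeholders (the log shows only the action plugin and the conditionals, not the command text):

    - name: Enable EPEL 7
      command: echo "enable the EPEL 7 repository here"   # placeholder assumption
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

    - name: Enable EPEL 8
      command: echo "enable the EPEL 8 repository here"   # placeholder assumption
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']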
11792 1727096121.21226: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 11792 1727096121.21293: in run() - task 0afff68d-5257-d9c7-3fc0-000000000049 11792 1727096121.21309: variable 'ansible_search_path' from source: unknown 11792 1727096121.21312: variable 'ansible_search_path' from source: unknown 11792 1727096121.21338: calling self._execute() 11792 1727096121.21394: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.21398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.21408: variable 'omit' from source: magic vars 11792 1727096121.21685: variable 'ansible_distribution' from source: facts 11792 1727096121.21695: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11792 1727096121.21784: variable 'ansible_distribution_major_version' from source: facts 11792 1727096121.21788: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11792 1727096121.21791: when evaluation is False, skipping this task 11792 1727096121.21794: _execute() done 11792 1727096121.21796: dumping result to json 11792 1727096121.21801: done dumping result, returning 11792 1727096121.21807: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0afff68d-5257-d9c7-3fc0-000000000049] 11792 1727096121.21812: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000049 11792 1727096121.21894: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000049 11792 1727096121.21897: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11792 1727096121.21939: no more pending results, returning what we have 11792 1727096121.21942: results queue empty 11792 1727096121.21943: checking for any_errors_fatal 11792 1727096121.21947: done checking for any_errors_fatal 11792 1727096121.21948: checking for max_fail_percentage 11792 1727096121.21950: done checking for max_fail_percentage 11792 1727096121.21950: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.21951: done checking to see if all hosts have failed 11792 1727096121.21952: getting the remaining hosts for this loop 11792 1727096121.21953: done getting the remaining hosts for this loop 11792 1727096121.21956: getting the next task for host managed_node2 11792 1727096121.21966: done getting next task for host managed_node2 11792 1727096121.21969: ^ task is: TASK: Enable EPEL 6 11792 1727096121.21973: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096121.21977: getting variables 11792 1727096121.21978: in VariableManager get_vars() 11792 1727096121.22003: Calling all_inventory to load vars for managed_node2 11792 1727096121.22006: Calling groups_inventory to load vars for managed_node2 11792 1727096121.22009: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.22018: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.22020: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.22022: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.22203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.22647: done with get_vars() 11792 1727096121.22658: done getting variables 11792 1727096121.22760: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 08:55:21 -0400 (0:00:00.019) 0:00:03.507 ****** 11792 1727096121.22793: entering _queue_task() for managed_node2/copy 11792 1727096121.23291: worker is 1 (out of 1 available) 11792 1727096121.23299: exiting _queue_task() for managed_node2/copy 11792 1727096121.23311: done queuing things up, now waiting for results queue to drain 11792 1727096121.23313: waiting for pending results... 11792 1727096121.23553: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 11792 1727096121.23557: in run() - task 0afff68d-5257-d9c7-3fc0-00000000004b 11792 1727096121.23560: variable 'ansible_search_path' from source: unknown 11792 1727096121.23562: variable 'ansible_search_path' from source: unknown 11792 1727096121.23566: calling self._execute() 11792 1727096121.23632: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.23653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.23671: variable 'omit' from source: magic vars 11792 1727096121.24058: variable 'ansible_distribution' from source: facts 11792 1727096121.24080: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11792 1727096121.24205: variable 'ansible_distribution_major_version' from source: facts 11792 1727096121.24216: Evaluated conditional (ansible_distribution_major_version == '6'): False 11792 1727096121.24223: when evaluation is False, skipping this task 11792 1727096121.24230: _execute() done 11792 1727096121.24238: dumping result to json 11792 1727096121.24245: done dumping result, returning 11792 1727096121.24305: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0afff68d-5257-d9c7-3fc0-00000000004b] 11792 1727096121.24308: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000004b 11792 1727096121.24382: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000004b 11792 1727096121.24385: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11792 1727096121.24457: no more pending results, returning what we have 11792 
1727096121.24461: results queue empty 11792 1727096121.24462: checking for any_errors_fatal 11792 1727096121.24470: done checking for any_errors_fatal 11792 1727096121.24471: checking for max_fail_percentage 11792 1727096121.24473: done checking for max_fail_percentage 11792 1727096121.24474: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.24475: done checking to see if all hosts have failed 11792 1727096121.24475: getting the remaining hosts for this loop 11792 1727096121.24477: done getting the remaining hosts for this loop 11792 1727096121.24481: getting the next task for host managed_node2 11792 1727096121.24490: done getting next task for host managed_node2 11792 1727096121.24493: ^ task is: TASK: Set network provider to 'nm' 11792 1727096121.24495: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.24499: getting variables 11792 1727096121.24501: in VariableManager get_vars() 11792 1727096121.24533: Calling all_inventory to load vars for managed_node2 11792 1727096121.24536: Calling groups_inventory to load vars for managed_node2 11792 1727096121.24539: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.24555: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.24558: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.24561: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.25053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.25248: done with get_vars() 11792 1727096121.25261: done getting variables 11792 1727096121.25319: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:13 Monday 23 September 2024 08:55:21 -0400 (0:00:00.025) 0:00:03.532 ****** 11792 1727096121.25346: entering _queue_task() for managed_node2/set_fact 11792 1727096121.25647: worker is 1 (out of 1 available) 11792 1727096121.25662: exiting _queue_task() for managed_node2/set_fact 11792 1727096121.25877: done queuing things up, now waiting for results queue to drain 11792 1727096121.25880: waiting for pending results... 
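The "TASK [Set network provider to 'nm']" banner above (tests_bond_options_nm.yml:13) is a set_fact task; the ok result a few entries below shows exactly which fact it sets. Reconstructed from the task name, the set_fact action plugin, and that result, the task amounts to:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm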
11792 1727096121.26010: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 11792 1727096121.26030: in run() - task 0afff68d-5257-d9c7-3fc0-000000000007 11792 1727096121.26055: variable 'ansible_search_path' from source: unknown 11792 1727096121.26213: calling self._execute() 11792 1727096121.26216: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.26218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.26220: variable 'omit' from source: magic vars 11792 1727096121.26307: variable 'omit' from source: magic vars 11792 1727096121.26379: variable 'omit' from source: magic vars 11792 1727096121.26412: variable 'omit' from source: magic vars 11792 1727096121.26463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096121.26506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096121.26534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096121.26561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096121.26580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096121.26865: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096121.26870: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.26873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.26876: Set connection var ansible_timeout to 10 11792 1727096121.26878: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096121.26880: Set connection var ansible_shell_executable to /bin/sh 11792 1727096121.26882: Set connection var ansible_pipelining to False 11792 1727096121.26884: Set connection var ansible_shell_type to sh 11792 1727096121.26975: Set connection var ansible_connection to ssh 11792 1727096121.27008: variable 'ansible_shell_executable' from source: unknown 11792 1727096121.27016: variable 'ansible_connection' from source: unknown 11792 1727096121.27023: variable 'ansible_module_compression' from source: unknown 11792 1727096121.27031: variable 'ansible_shell_type' from source: unknown 11792 1727096121.27037: variable 'ansible_shell_executable' from source: unknown 11792 1727096121.27043: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.27216: variable 'ansible_pipelining' from source: unknown 11792 1727096121.27220: variable 'ansible_timeout' from source: unknown 11792 1727096121.27222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.27446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096121.27466: variable 'omit' from source: magic vars 11792 1727096121.27480: starting attempt loop 11792 1727096121.27488: running the handler 11792 1727096121.27505: handler run complete 11792 1727096121.27521: attempt loop complete, returning result 11792 1727096121.27546: _execute() done 11792 1727096121.27557: 
dumping result to json 11792 1727096121.27774: done dumping result, returning 11792 1727096121.27777: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0afff68d-5257-d9c7-3fc0-000000000007] 11792 1727096121.27780: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000007 11792 1727096121.27849: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000007 11792 1727096121.27855: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11792 1727096121.27935: no more pending results, returning what we have 11792 1727096121.27938: results queue empty 11792 1727096121.27939: checking for any_errors_fatal 11792 1727096121.27947: done checking for any_errors_fatal 11792 1727096121.27956: checking for max_fail_percentage 11792 1727096121.27959: done checking for max_fail_percentage 11792 1727096121.27960: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.27960: done checking to see if all hosts have failed 11792 1727096121.27981: getting the remaining hosts for this loop 11792 1727096121.27984: done getting the remaining hosts for this loop 11792 1727096121.27991: getting the next task for host managed_node2 11792 1727096121.28017: done getting next task for host managed_node2 11792 1727096121.28019: ^ task is: TASK: meta (flush_handlers) 11792 1727096121.28022: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.28028: getting variables 11792 1727096121.28029: in VariableManager get_vars() 11792 1727096121.28065: Calling all_inventory to load vars for managed_node2 11792 1727096121.28070: Calling groups_inventory to load vars for managed_node2 11792 1727096121.28074: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.28087: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.28091: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.28094: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.28493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.28683: done with get_vars() 11792 1727096121.28695: done getting variables 11792 1727096121.28767: in VariableManager get_vars() 11792 1727096121.28779: Calling all_inventory to load vars for managed_node2 11792 1727096121.28781: Calling groups_inventory to load vars for managed_node2 11792 1727096121.28784: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.28789: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.28791: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.28794: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.28964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.29144: done with get_vars() 11792 1727096121.29161: done queuing things up, now waiting for results queue to drain 11792 1727096121.29163: results queue empty 11792 1727096121.29164: checking for any_errors_fatal 11792 1727096121.29168: done checking for any_errors_fatal 11792 1727096121.29169: checking for 
max_fail_percentage 11792 1727096121.29170: done checking for max_fail_percentage 11792 1727096121.29171: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.29172: done checking to see if all hosts have failed 11792 1727096121.29173: getting the remaining hosts for this loop 11792 1727096121.29173: done getting the remaining hosts for this loop 11792 1727096121.29176: getting the next task for host managed_node2 11792 1727096121.29181: done getting next task for host managed_node2 11792 1727096121.29182: ^ task is: TASK: meta (flush_handlers) 11792 1727096121.29184: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.29193: getting variables 11792 1727096121.29194: in VariableManager get_vars() 11792 1727096121.29202: Calling all_inventory to load vars for managed_node2 11792 1727096121.29204: Calling groups_inventory to load vars for managed_node2 11792 1727096121.29206: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.29211: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.29214: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.29217: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.29347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.29527: done with get_vars() 11792 1727096121.29535: done getting variables 11792 1727096121.29590: in VariableManager get_vars() 11792 1727096121.29599: Calling all_inventory to load vars for managed_node2 11792 1727096121.29601: Calling groups_inventory to load vars for managed_node2 11792 1727096121.29603: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.29608: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.29610: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.29613: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.29760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.29957: done with get_vars() 11792 1727096121.29972: done queuing things up, now waiting for results queue to drain 11792 1727096121.29975: results queue empty 11792 1727096121.29975: checking for any_errors_fatal 11792 1727096121.29977: done checking for any_errors_fatal 11792 1727096121.29977: checking for max_fail_percentage 11792 1727096121.29978: done checking for max_fail_percentage 11792 1727096121.29979: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.29979: done checking to see if all hosts have failed 11792 1727096121.29980: getting the remaining hosts for this loop 11792 1727096121.29981: done getting the remaining hosts for this loop 11792 1727096121.29984: getting the next task for host managed_node2 11792 1727096121.29987: done getting next task for host managed_node2 11792 1727096121.29988: ^ task is: None 11792 1727096121.29990: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.29991: done queuing things up, now waiting for results queue to drain 11792 1727096121.29991: results queue empty 11792 1727096121.29992: checking for any_errors_fatal 11792 1727096121.29993: done checking for any_errors_fatal 11792 1727096121.29993: checking for max_fail_percentage 11792 1727096121.29994: done checking for max_fail_percentage 11792 1727096121.29995: checking to see if all hosts have failed and the running result is not ok 11792 1727096121.29996: done checking to see if all hosts have failed 11792 1727096121.29998: getting the next task for host managed_node2 11792 1727096121.30000: done getting next task for host managed_node2 11792 1727096121.30001: ^ task is: None 11792 1727096121.30002: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096121.30057: in VariableManager get_vars() 11792 1727096121.30073: done with get_vars() 11792 1727096121.30079: in VariableManager get_vars() 11792 1727096121.30088: done with get_vars() 11792 1727096121.30092: variable 'omit' from source: magic vars 11792 1727096121.30122: in VariableManager get_vars() 11792 1727096121.30131: done with get_vars() 11792 1727096121.30152: variable 'omit' from source: magic vars PLAY [Play for testing bond options] ******************************************* 11792 1727096121.30388: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11792 1727096121.30413: getting the remaining hosts for this loop 11792 1727096121.30414: done getting the remaining hosts for this loop 11792 1727096121.30416: getting the next task for host managed_node2 11792 1727096121.30419: done getting next task for host managed_node2 11792 1727096121.30421: ^ task is: TASK: Gathering Facts 11792 1727096121.30422: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096121.30424: getting variables 11792 1727096121.30425: in VariableManager get_vars() 11792 1727096121.30432: Calling all_inventory to load vars for managed_node2 11792 1727096121.30434: Calling groups_inventory to load vars for managed_node2 11792 1727096121.30436: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096121.30441: Calling all_plugins_play to load vars for managed_node2 11792 1727096121.30457: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096121.30461: Calling groups_plugins_play to load vars for managed_node2 11792 1727096121.30597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096121.30768: done with get_vars() 11792 1727096121.30777: done getting variables 11792 1727096121.30816: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 Monday 23 September 2024 08:55:21 -0400 (0:00:00.054) 0:00:03.587 ****** 11792 1727096121.30838: entering _queue_task() for managed_node2/gather_facts 11792 1727096121.31135: worker is 1 (out of 1 available) 11792 1727096121.31144: exiting _queue_task() for managed_node2/gather_facts 11792 1727096121.31160: done queuing things up, now waiting for results queue to drain 11792 1727096121.31162: waiting for pending results... 
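The "PLAY [Play for testing bond options]" and "TASK [Gathering Facts]" banners above mark the start of the next play at tests/network/playbooks/tests_bond_options.yml:3; the entries that follow show the implicit fact-gathering step connecting over SSH and running the setup module. A minimal sketch of a play opening consistent with these entries; the hosts pattern is an assumption, since the log only shows managed_node2 being processed:

    - name: Play for testing bond options
      hosts: all           # assumption; the target pattern is not shown in this log excerpt
      gather_facts: true   # matches the 'Gathering Facts' task that runs next
      tasks: []            # the play's tasks appear later in the log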
11792 1727096121.31403: running TaskExecutor() for managed_node2/TASK: Gathering Facts 11792 1727096121.31573: in run() - task 0afff68d-5257-d9c7-3fc0-000000000071 11792 1727096121.31577: variable 'ansible_search_path' from source: unknown 11792 1727096121.31581: calling self._execute() 11792 1727096121.31640: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.31655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.31671: variable 'omit' from source: magic vars 11792 1727096121.32131: variable 'ansible_distribution_major_version' from source: facts 11792 1727096121.32148: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096121.32161: variable 'omit' from source: magic vars 11792 1727096121.32188: variable 'omit' from source: magic vars 11792 1727096121.32226: variable 'omit' from source: magic vars 11792 1727096121.32356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096121.32359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096121.32362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096121.32364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096121.32380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096121.32411: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096121.32418: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.32424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.32521: Set connection var ansible_timeout to 10 11792 1727096121.32534: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096121.32547: Set connection var ansible_shell_executable to /bin/sh 11792 1727096121.32559: Set connection var ansible_pipelining to False 11792 1727096121.32566: Set connection var ansible_shell_type to sh 11792 1727096121.32574: Set connection var ansible_connection to ssh 11792 1727096121.32604: variable 'ansible_shell_executable' from source: unknown 11792 1727096121.32611: variable 'ansible_connection' from source: unknown 11792 1727096121.32617: variable 'ansible_module_compression' from source: unknown 11792 1727096121.32772: variable 'ansible_shell_type' from source: unknown 11792 1727096121.32776: variable 'ansible_shell_executable' from source: unknown 11792 1727096121.32778: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096121.32780: variable 'ansible_pipelining' from source: unknown 11792 1727096121.32782: variable 'ansible_timeout' from source: unknown 11792 1727096121.32784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096121.32830: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096121.32846: variable 'omit' from source: magic vars 11792 1727096121.32860: starting attempt loop 11792 1727096121.32867: running the 
handler 11792 1727096121.32888: variable 'ansible_facts' from source: unknown 11792 1727096121.32917: _low_level_execute_command(): starting 11792 1727096121.32929: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096121.33642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096121.33773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096121.33795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096121.33878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096121.36345: stdout chunk (state=3): >>>/root <<< 11792 1727096121.36543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096121.36777: stderr chunk (state=3): >>><<< 11792 1727096121.36781: stdout chunk (state=3): >>><<< 11792 1727096121.36785: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096121.36787: _low_level_execute_command(): starting 11792 1727096121.36790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366 `" && echo ansible-tmp-1727096121.3669033-11989-245885793945366="` echo /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366 `" ) && sleep 0' 
11792 1727096121.38021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096121.38109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096121.38186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096121.38210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096121.38238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096121.38398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096121.41206: stdout chunk (state=3): >>>ansible-tmp-1727096121.3669033-11989-245885793945366=/root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366 <<< 11792 1727096121.41441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096121.41473: stderr chunk (state=3): >>><<< 11792 1727096121.41476: stdout chunk (state=3): >>><<< 11792 1727096121.41551: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096121.3669033-11989-245885793945366=/root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096121.41555: variable 'ansible_module_compression' from source: unknown 11792 1727096121.41775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11792 1727096121.41778: variable 'ansible_facts' from source: unknown 11792 1727096121.42252: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/AnsiballZ_setup.py 11792 1727096121.42491: Sending initial data 11792 1727096121.42504: Sent initial data (154 bytes) 11792 1727096121.43840: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096121.43876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096121.43896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096121.43941: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096121.44012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096121.44043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096121.44069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096121.44180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096121.46476: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096121.46505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096121.46566: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp_9t0ruia /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/AnsiballZ_setup.py <<< 11792 1727096121.46577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/AnsiballZ_setup.py" <<< 11792 1727096121.46642: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp_9t0ruia" to remote "/root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/AnsiballZ_setup.py" <<< 11792 1727096121.48751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096121.48764: stdout chunk (state=3): >>><<< 11792 1727096121.48780: stderr chunk (state=3): >>><<< 11792 1727096121.48812: done transferring module to remote 11792 1727096121.48826: _low_level_execute_command(): starting 11792 1727096121.48834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/ /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/AnsiballZ_setup.py && sleep 0' 11792 1727096121.49559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096121.49580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096121.49641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096121.49659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096121.49747: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096121.49763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096121.49800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096121.49814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096121.49896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096121.52579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096121.52590: stdout chunk (state=3): >>><<< 11792 1727096121.52601: stderr chunk (state=3): >>><<< 11792 1727096121.52624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096121.52673: _low_level_execute_command(): starting 11792 1727096121.52677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/AnsiballZ_setup.py && sleep 0' 11792 1727096121.53288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096121.53304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096121.53330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096121.53348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096121.53442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096121.53470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096121.53497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096121.53512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096121.53611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096122.40115: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt<<< 11792 1727096122.40144: stdout chunk (state=3): >>>2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3270, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 264, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794082816, "block_size": 4096, "block_total": 65519099, "block_available": 63914571, "block_used": 1604528, "inode_total": 131070960, "inode_available": 131029116, "inode_used": 41844, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.46240234375, "5m": 0.3916015625, "15m": 0.1787109375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", 
"minute": "55", "second": "22", "epoch": "1727096122", "epoch_int": "1727096122", "date": "2024-09-23", "time": "08:55:22", "iso8601_micro": "2024-09-23T12:55:22.343805Z", "iso8601": "2024-09-23T12:55:22Z", "iso8601_basic": "20240923T085522343805", "iso8601_basic_short": "20240923T085522", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11792 1727096122.42989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096122.42994: stdout chunk (state=3): >>><<< 11792 1727096122.42997: stderr chunk (state=3): >>><<< 11792 1727096122.43123: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3270, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 264, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 
261794082816, "block_size": 4096, "block_total": 65519099, "block_available": 63914571, "block_used": 1604528, "inode_total": 131070960, "inode_available": 131029116, "inode_used": 41844, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.46240234375, "5m": 0.3916015625, "15m": 0.1787109375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "22", "epoch": "1727096122", "epoch_int": "1727096122", "date": "2024-09-23", "time": "08:55:22", "iso8601_micro": "2024-09-23T12:55:22.343805Z", "iso8601": "2024-09-23T12:55:22Z", "iso8601_basic": "20240923T085522343805", "iso8601_basic_short": "20240923T085522", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": 
"off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096122.43390: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096122.43436: _low_level_execute_command(): starting 11792 1727096122.43444: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096121.3669033-11989-245885793945366/ > /dev/null 2>&1 && sleep 0' 11792 1727096122.44119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096122.44123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096122.44218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 
1727096122.44245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096122.44264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096122.44350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096122.46667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096122.46692: stdout chunk (state=3): >>><<< 11792 1727096122.46694: stderr chunk (state=3): >>><<< 11792 1727096122.46774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096122.46779: handler run complete 11792 1727096122.46796: variable 'ansible_facts' from source: unknown 11792 1727096122.46858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.47053: variable 'ansible_facts' from source: unknown 11792 1727096122.47106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.47183: attempt loop complete, returning result 11792 1727096122.47186: _execute() done 11792 1727096122.47189: dumping result to json 11792 1727096122.47207: done dumping result, returning 11792 1727096122.47214: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0afff68d-5257-d9c7-3fc0-000000000071] 11792 1727096122.47219: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000071 11792 1727096122.47506: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000071 11792 1727096122.47509: WORKER PROCESS EXITING ok: [managed_node2] 11792 1727096122.47700: no more pending results, returning what we have 11792 1727096122.47702: results queue empty 11792 1727096122.47703: checking for any_errors_fatal 11792 1727096122.47704: done checking for any_errors_fatal 11792 1727096122.47704: checking for max_fail_percentage 11792 1727096122.47705: done checking for max_fail_percentage 11792 1727096122.47706: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.47706: done checking to see if all hosts have failed 11792 1727096122.47707: getting the remaining hosts for this loop 11792 1727096122.47708: done getting the remaining hosts for this loop 11792 1727096122.47710: getting the next task for host managed_node2 11792 1727096122.47714: done getting 
next task for host managed_node2 11792 1727096122.47715: ^ task is: TASK: meta (flush_handlers) 11792 1727096122.47716: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096122.47719: getting variables 11792 1727096122.47719: in VariableManager get_vars() 11792 1727096122.47737: Calling all_inventory to load vars for managed_node2 11792 1727096122.47739: Calling groups_inventory to load vars for managed_node2 11792 1727096122.47742: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.47751: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.47753: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.47755: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.47855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.47965: done with get_vars() 11792 1727096122.47974: done getting variables 11792 1727096122.48021: in VariableManager get_vars() 11792 1727096122.48026: Calling all_inventory to load vars for managed_node2 11792 1727096122.48028: Calling groups_inventory to load vars for managed_node2 11792 1727096122.48029: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.48032: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.48034: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.48035: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.48121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.48240: done with get_vars() 11792 1727096122.48249: done queuing things up, now waiting for results queue to drain 11792 1727096122.48252: results queue empty 11792 1727096122.48253: checking for any_errors_fatal 11792 1727096122.48255: done checking for any_errors_fatal 11792 1727096122.48256: checking for max_fail_percentage 11792 1727096122.48256: done checking for max_fail_percentage 11792 1727096122.48257: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.48261: done checking to see if all hosts have failed 11792 1727096122.48262: getting the remaining hosts for this loop 11792 1727096122.48262: done getting the remaining hosts for this loop 11792 1727096122.48264: getting the next task for host managed_node2 11792 1727096122.48268: done getting next task for host managed_node2 11792 1727096122.48270: ^ task is: TASK: Show playbook name 11792 1727096122.48271: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096122.48272: getting variables 11792 1727096122.48273: in VariableManager get_vars() 11792 1727096122.48279: Calling all_inventory to load vars for managed_node2 11792 1727096122.48281: Calling groups_inventory to load vars for managed_node2 11792 1727096122.48283: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.48287: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.48288: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.48290: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.48366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.48473: done with get_vars() 11792 1727096122.48479: done getting variables 11792 1727096122.48538: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:32 Monday 23 September 2024 08:55:22 -0400 (0:00:01.177) 0:00:04.764 ****** 11792 1727096122.48559: entering _queue_task() for managed_node2/debug 11792 1727096122.48560: Creating lock for debug 11792 1727096122.48810: worker is 1 (out of 1 available) 11792 1727096122.48823: exiting _queue_task() for managed_node2/debug 11792 1727096122.48836: done queuing things up, now waiting for results queue to drain 11792 1727096122.48838: waiting for pending results... 
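At this point the facts returned by the setup module above are cached for managed_node2, and every later task in this run consumes them as ordinary variables (the repeated "Evaluated conditional (ansible_distribution_major_version != '6')" entries are one example). As a minimal, illustrative sketch of how such gathered facts can be referenced — this task is not part of tests_bond_options.yml and its name is hypothetical:

- name: Report distribution and primary address (hypothetical example)
  ansible.builtin.debug:
    msg: >-
      {{ ansible_facts['distribution'] }} {{ ansible_facts['distribution_major_version'] }}
      reachable at {{ ansible_facts['default_ipv4']['address'] }}
  when: ansible_facts['distribution_major_version'] != '6'

With the facts shown earlier in this log, that message would render roughly as "CentOS 10 reachable at 10.31.15.126".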
11792 1727096122.48985: running TaskExecutor() for managed_node2/TASK: Show playbook name 11792 1727096122.49043: in run() - task 0afff68d-5257-d9c7-3fc0-00000000000b 11792 1727096122.49056: variable 'ansible_search_path' from source: unknown 11792 1727096122.49090: calling self._execute() 11792 1727096122.49140: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.49146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.49156: variable 'omit' from source: magic vars 11792 1727096122.49426: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.49437: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.49442: variable 'omit' from source: magic vars 11792 1727096122.49464: variable 'omit' from source: magic vars 11792 1727096122.49490: variable 'omit' from source: magic vars 11792 1727096122.49525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096122.49555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.49571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096122.49584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.49593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.49620: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.49624: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.49627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.49696: Set connection var ansible_timeout to 10 11792 1727096122.49703: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.49713: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.49718: Set connection var ansible_pipelining to False 11792 1727096122.49720: Set connection var ansible_shell_type to sh 11792 1727096122.49722: Set connection var ansible_connection to ssh 11792 1727096122.49740: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.49742: variable 'ansible_connection' from source: unknown 11792 1727096122.49745: variable 'ansible_module_compression' from source: unknown 11792 1727096122.49748: variable 'ansible_shell_type' from source: unknown 11792 1727096122.49753: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.49755: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.49757: variable 'ansible_pipelining' from source: unknown 11792 1727096122.49759: variable 'ansible_timeout' from source: unknown 11792 1727096122.49761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.49866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.49876: variable 'omit' from source: magic vars 11792 1727096122.49881: starting attempt loop 11792 1727096122.49884: running the handler 11792 
1727096122.49921: handler run complete 11792 1727096122.49940: attempt loop complete, returning result 11792 1727096122.49943: _execute() done 11792 1727096122.49946: dumping result to json 11792 1727096122.49948: done dumping result, returning 11792 1727096122.49954: done running TaskExecutor() for managed_node2/TASK: Show playbook name [0afff68d-5257-d9c7-3fc0-00000000000b] 11792 1727096122.49956: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000000b 11792 1727096122.50043: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000000b 11792 1727096122.50046: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: this is: playbooks/tests_bond_options.yml 11792 1727096122.50096: no more pending results, returning what we have 11792 1727096122.50099: results queue empty 11792 1727096122.50100: checking for any_errors_fatal 11792 1727096122.50101: done checking for any_errors_fatal 11792 1727096122.50102: checking for max_fail_percentage 11792 1727096122.50104: done checking for max_fail_percentage 11792 1727096122.50104: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.50105: done checking to see if all hosts have failed 11792 1727096122.50105: getting the remaining hosts for this loop 11792 1727096122.50107: done getting the remaining hosts for this loop 11792 1727096122.50111: getting the next task for host managed_node2 11792 1727096122.50117: done getting next task for host managed_node2 11792 1727096122.50120: ^ task is: TASK: Include the task 'run_test.yml' 11792 1727096122.50122: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096122.50125: getting variables 11792 1727096122.50126: in VariableManager get_vars() 11792 1727096122.50158: Calling all_inventory to load vars for managed_node2 11792 1727096122.50161: Calling groups_inventory to load vars for managed_node2 11792 1727096122.50164: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.50183: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.50186: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.50189: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.50361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.50476: done with get_vars() 11792 1727096122.50484: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:42 Monday 23 September 2024 08:55:22 -0400 (0:00:00.019) 0:00:04.784 ****** 11792 1727096122.50547: entering _queue_task() for managed_node2/include_tasks 11792 1727096122.50774: worker is 1 (out of 1 available) 11792 1727096122.50787: exiting _queue_task() for managed_node2/include_tasks 11792 1727096122.50799: done queuing things up, now waiting for results queue to drain 11792 1727096122.50801: waiting for pending results... 
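The two tasks traced above sit at tests_bond_options.yml:32 and :42; their exact wording is not reproduced in this trace. A plausible sketch of their shape, with the rendered message taken from the ok: result above and the include path inferred from the block-loading messages that follow:

- name: Show playbook name
  ansible.builtin.debug:
    msg: "this is: playbooks/tests_bond_options.yml"  # rendered output seen above; the real task presumably templates this value

- name: Include the task 'run_test.yml'
  ansible.builtin.include_tasks:
    file: tasks/run_test.yml  # path inferred from the include processing logged below

The included run_test.yml then opens with a debug task whose name templates lsr_description, which is why the next task banner reads TEST: followed by the bond-options scenario description.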
11792 1727096122.50956: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 11792 1727096122.51011: in run() - task 0afff68d-5257-d9c7-3fc0-00000000000d 11792 1727096122.51023: variable 'ansible_search_path' from source: unknown 11792 1727096122.51057: calling self._execute() 11792 1727096122.51114: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.51119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.51127: variable 'omit' from source: magic vars 11792 1727096122.51398: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.51408: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.51413: _execute() done 11792 1727096122.51417: dumping result to json 11792 1727096122.51419: done dumping result, returning 11792 1727096122.51425: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0afff68d-5257-d9c7-3fc0-00000000000d] 11792 1727096122.51431: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000000d 11792 1727096122.51536: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000000d 11792 1727096122.51538: WORKER PROCESS EXITING 11792 1727096122.51574: no more pending results, returning what we have 11792 1727096122.51578: in VariableManager get_vars() 11792 1727096122.51611: Calling all_inventory to load vars for managed_node2 11792 1727096122.51614: Calling groups_inventory to load vars for managed_node2 11792 1727096122.51618: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.51630: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.51632: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.51635: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.51786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.51900: done with get_vars() 11792 1727096122.51905: variable 'ansible_search_path' from source: unknown 11792 1727096122.51916: we have included files to process 11792 1727096122.51917: generating all_blocks data 11792 1727096122.51918: done generating all_blocks data 11792 1727096122.51918: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11792 1727096122.51919: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11792 1727096122.51920: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11792 1727096122.52292: in VariableManager get_vars() 11792 1727096122.52302: done with get_vars() 11792 1727096122.52331: in VariableManager get_vars() 11792 1727096122.52341: done with get_vars() 11792 1727096122.52365: in VariableManager get_vars() 11792 1727096122.52376: done with get_vars() 11792 1727096122.52401: in VariableManager get_vars() 11792 1727096122.52409: done with get_vars() 11792 1727096122.52445: in VariableManager get_vars() 11792 1727096122.52456: done with get_vars() 11792 1727096122.52689: in VariableManager get_vars() 11792 1727096122.52699: done with get_vars() 11792 1727096122.52707: done processing included file 11792 1727096122.52708: iterating over new_blocks loaded from include file 11792 1727096122.52708: in VariableManager get_vars() 11792 
1727096122.52714: done with get_vars() 11792 1727096122.52715: filtering new block on tags 11792 1727096122.52781: done filtering new block on tags 11792 1727096122.52784: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 11792 1727096122.52788: extending task lists for all hosts with included blocks 11792 1727096122.52809: done extending task lists 11792 1727096122.52810: done processing included files 11792 1727096122.52810: results queue empty 11792 1727096122.52811: checking for any_errors_fatal 11792 1727096122.52814: done checking for any_errors_fatal 11792 1727096122.52814: checking for max_fail_percentage 11792 1727096122.52815: done checking for max_fail_percentage 11792 1727096122.52816: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.52816: done checking to see if all hosts have failed 11792 1727096122.52817: getting the remaining hosts for this loop 11792 1727096122.52817: done getting the remaining hosts for this loop 11792 1727096122.52819: getting the next task for host managed_node2 11792 1727096122.52821: done getting next task for host managed_node2 11792 1727096122.52823: ^ task is: TASK: TEST: {{ lsr_description }} 11792 1727096122.52824: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096122.52826: getting variables 11792 1727096122.52827: in VariableManager get_vars() 11792 1727096122.52832: Calling all_inventory to load vars for managed_node2 11792 1727096122.52834: Calling groups_inventory to load vars for managed_node2 11792 1727096122.52835: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.52839: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.52841: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.52842: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.52948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.53057: done with get_vars() 11792 1727096122.53063: done getting variables 11792 1727096122.53093: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096122.53180: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] 
*** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Monday 23 September 2024 08:55:22 -0400 (0:00:00.026) 0:00:04.811 ****** 11792 1727096122.53211: entering _queue_task() for managed_node2/debug 11792 1727096122.53454: worker is 1 (out of 1 available) 11792 1727096122.53466: exiting _queue_task() for managed_node2/debug 11792 1727096122.53481: done queuing things up, now waiting for results queue to drain 11792 1727096122.53482: waiting for pending results... 11792 1727096122.53637: running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 11792 1727096122.53700: in run() - task 0afff68d-5257-d9c7-3fc0-000000000088 11792 1727096122.53711: variable 'ansible_search_path' from source: unknown 11792 1727096122.53715: variable 'ansible_search_path' from source: unknown 11792 1727096122.53747: calling self._execute() 11792 1727096122.53808: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.53813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.53821: variable 'omit' from source: magic vars 11792 1727096122.54092: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.54101: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.54107: variable 'omit' from source: magic vars 11792 1727096122.54133: variable 'omit' from source: magic vars 11792 1727096122.54209: variable 'lsr_description' from source: include params 11792 1727096122.54224: variable 'omit' from source: magic vars 11792 1727096122.54259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096122.54293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.54309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096122.54322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.54330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.54358: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.54361: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.54363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.54433: Set connection var ansible_timeout to 10 11792 1727096122.54440: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.54448: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.54455: Set connection var ansible_pipelining to False 11792 1727096122.54458: Set connection var ansible_shell_type to sh 11792 1727096122.54461: Set connection var ansible_connection to ssh 11792 1727096122.54482: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.54487: variable 'ansible_connection' from source: unknown 11792 1727096122.54489: variable 'ansible_module_compression' from source: unknown 11792 1727096122.54492: variable 'ansible_shell_type' from source: unknown 11792 
1727096122.54494: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.54496: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.54498: variable 'ansible_pipelining' from source: unknown 11792 1727096122.54500: variable 'ansible_timeout' from source: unknown 11792 1727096122.54502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.54607: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.54620: variable 'omit' from source: magic vars 11792 1727096122.54623: starting attempt loop 11792 1727096122.54627: running the handler 11792 1727096122.54661: handler run complete 11792 1727096122.54673: attempt loop complete, returning result 11792 1727096122.54676: _execute() done 11792 1727096122.54679: dumping result to json 11792 1727096122.54681: done dumping result, returning 11792 1727096122.54689: done running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0afff68d-5257-d9c7-3fc0-000000000088] 11792 1727096122.54693: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000088 11792 1727096122.54779: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000088 11792 1727096122.54782: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 11792 1727096122.54837: no more pending results, returning what we have 11792 1727096122.54841: results queue empty 11792 1727096122.54842: checking for any_errors_fatal 11792 1727096122.54843: done checking for any_errors_fatal 11792 1727096122.54844: checking for max_fail_percentage 11792 1727096122.54845: done checking for max_fail_percentage 11792 1727096122.54846: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.54846: done checking to see if all hosts have failed 11792 1727096122.54847: getting the remaining hosts for this loop 11792 1727096122.54849: done getting the remaining hosts for this loop 11792 1727096122.54852: getting the next task for host managed_node2 11792 1727096122.54859: done getting next task for host managed_node2 11792 1727096122.54861: ^ task is: TASK: Show item 11792 1727096122.54864: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096122.54869: getting variables 11792 1727096122.54871: in VariableManager get_vars() 11792 1727096122.54901: Calling all_inventory to load vars for managed_node2 11792 1727096122.54904: Calling groups_inventory to load vars for managed_node2 11792 1727096122.54907: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.54918: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.54920: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.54922: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.55064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.55185: done with get_vars() 11792 1727096122.55197: done getting variables 11792 1727096122.55237: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Monday 23 September 2024 08:55:22 -0400 (0:00:00.020) 0:00:04.831 ****** 11792 1727096122.55264: entering _queue_task() for managed_node2/debug 11792 1727096122.55487: worker is 1 (out of 1 available) 11792 1727096122.55502: exiting _queue_task() for managed_node2/debug 11792 1727096122.55514: done queuing things up, now waiting for results queue to drain 11792 1727096122.55516: waiting for pending results... 11792 1727096122.55664: running TaskExecutor() for managed_node2/TASK: Show item 11792 1727096122.55718: in run() - task 0afff68d-5257-d9c7-3fc0-000000000089 11792 1727096122.55729: variable 'ansible_search_path' from source: unknown 11792 1727096122.55733: variable 'ansible_search_path' from source: unknown 11792 1727096122.55780: variable 'omit' from source: magic vars 11792 1727096122.55874: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.55881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.55890: variable 'omit' from source: magic vars 11792 1727096122.56372: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.56380: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.56387: variable 'omit' from source: magic vars 11792 1727096122.56417: variable 'omit' from source: magic vars 11792 1727096122.56451: variable 'item' from source: unknown 11792 1727096122.56513: variable 'item' from source: unknown 11792 1727096122.56522: variable 'omit' from source: magic vars 11792 1727096122.56556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096122.56585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.56601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096122.56621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.56627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11792 1727096122.56652: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.56656: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.56658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.56725: Set connection var ansible_timeout to 10 11792 1727096122.56734: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.56739: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.56747: Set connection var ansible_pipelining to False 11792 1727096122.56751: Set connection var ansible_shell_type to sh 11792 1727096122.56754: Set connection var ansible_connection to ssh 11792 1727096122.56768: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.56771: variable 'ansible_connection' from source: unknown 11792 1727096122.56774: variable 'ansible_module_compression' from source: unknown 11792 1727096122.56776: variable 'ansible_shell_type' from source: unknown 11792 1727096122.56778: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.56781: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.56785: variable 'ansible_pipelining' from source: unknown 11792 1727096122.56788: variable 'ansible_timeout' from source: unknown 11792 1727096122.56792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.56893: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.56901: variable 'omit' from source: magic vars 11792 1727096122.56906: starting attempt loop 11792 1727096122.56909: running the handler 11792 1727096122.56945: variable 'lsr_description' from source: include params 11792 1727096122.56998: variable 'lsr_description' from source: include params 11792 1727096122.57007: handler run complete 11792 1727096122.57021: attempt loop complete, returning result 11792 1727096122.57033: variable 'item' from source: unknown 11792 1727096122.57085: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." 
} 11792 1727096122.57234: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.57237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.57240: variable 'omit' from source: magic vars 11792 1727096122.57302: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.57307: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.57310: variable 'omit' from source: magic vars 11792 1727096122.57322: variable 'omit' from source: magic vars 11792 1727096122.57352: variable 'item' from source: unknown 11792 1727096122.57398: variable 'item' from source: unknown 11792 1727096122.57407: variable 'omit' from source: magic vars 11792 1727096122.57422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.57430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.57436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.57447: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.57453: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.57455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.57506: Set connection var ansible_timeout to 10 11792 1727096122.57509: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.57516: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.57520: Set connection var ansible_pipelining to False 11792 1727096122.57523: Set connection var ansible_shell_type to sh 11792 1727096122.57525: Set connection var ansible_connection to ssh 11792 1727096122.57539: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.57542: variable 'ansible_connection' from source: unknown 11792 1727096122.57544: variable 'ansible_module_compression' from source: unknown 11792 1727096122.57546: variable 'ansible_shell_type' from source: unknown 11792 1727096122.57549: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.57554: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.57556: variable 'ansible_pipelining' from source: unknown 11792 1727096122.57558: variable 'ansible_timeout' from source: unknown 11792 1727096122.57561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.57626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.57634: variable 'omit' from source: magic vars 11792 1727096122.57637: starting attempt loop 11792 1727096122.57640: running the handler 11792 1727096122.57658: variable 'lsr_setup' from source: include params 11792 1727096122.57707: variable 'lsr_setup' from source: include params 11792 1727096122.57742: handler run complete 11792 1727096122.57756: attempt loop complete, returning result 11792 1727096122.57766: variable 'item' from source: unknown 11792 1727096122.57812: variable 'item' from 
source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 11792 1727096122.57898: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.57901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.57909: variable 'omit' from source: magic vars 11792 1727096122.58011: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.58015: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.58017: variable 'omit' from source: magic vars 11792 1727096122.58029: variable 'omit' from source: magic vars 11792 1727096122.58062: variable 'item' from source: unknown 11792 1727096122.58105: variable 'item' from source: unknown 11792 1727096122.58116: variable 'omit' from source: magic vars 11792 1727096122.58131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.58138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.58143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.58159: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.58162: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.58165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.58210: Set connection var ansible_timeout to 10 11792 1727096122.58215: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.58223: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.58228: Set connection var ansible_pipelining to False 11792 1727096122.58231: Set connection var ansible_shell_type to sh 11792 1727096122.58233: Set connection var ansible_connection to ssh 11792 1727096122.58248: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.58250: variable 'ansible_connection' from source: unknown 11792 1727096122.58255: variable 'ansible_module_compression' from source: unknown 11792 1727096122.58260: variable 'ansible_shell_type' from source: unknown 11792 1727096122.58262: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.58264: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.58266: variable 'ansible_pipelining' from source: unknown 11792 1727096122.58278: variable 'ansible_timeout' from source: unknown 11792 1727096122.58280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.58334: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.58340: variable 'omit' from source: magic vars 11792 1727096122.58344: starting attempt loop 11792 1727096122.58347: running the handler 11792 1727096122.58365: variable 'lsr_test' from source: include params 11792 1727096122.58412: variable 'lsr_test' from source: include params 11792 
1727096122.58425: handler run complete 11792 1727096122.58435: attempt loop complete, returning result 11792 1727096122.58446: variable 'item' from source: unknown 11792 1727096122.58496: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile.yml" ] } 11792 1727096122.58575: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.58578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.58581: variable 'omit' from source: magic vars 11792 1727096122.58680: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.58684: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.58688: variable 'omit' from source: magic vars 11792 1727096122.58700: variable 'omit' from source: magic vars 11792 1727096122.58729: variable 'item' from source: unknown 11792 1727096122.58775: variable 'item' from source: unknown 11792 1727096122.58785: variable 'omit' from source: magic vars 11792 1727096122.58800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.58806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.58813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.58824: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.58827: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.58829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.58877: Set connection var ansible_timeout to 10 11792 1727096122.58883: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.58890: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.58895: Set connection var ansible_pipelining to False 11792 1727096122.58899: Set connection var ansible_shell_type to sh 11792 1727096122.58901: Set connection var ansible_connection to ssh 11792 1727096122.58916: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.58919: variable 'ansible_connection' from source: unknown 11792 1727096122.58921: variable 'ansible_module_compression' from source: unknown 11792 1727096122.58925: variable 'ansible_shell_type' from source: unknown 11792 1727096122.58928: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.58930: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.58932: variable 'ansible_pipelining' from source: unknown 11792 1727096122.58934: variable 'ansible_timeout' from source: unknown 11792 1727096122.58944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.59003: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.59009: variable 'omit' from source: magic vars 11792 1727096122.59012: starting attempt loop 11792 1727096122.59015: running the handler 
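The per-item results above and below come from the 'Show item' task at run_test.yml:9, a debug task that loops over the names of the lsr_* parameters this bond test passes to run_test.yml. The test's source file is not reproduced in this log, so the following is only a minimal sketch of a task with this behavior, assuming a loop over the parameter names and debug's var option (which is what produces the "VARIABLE IS NOT DEFINED!" text for the undefined lsr_assert_when item further down):

    # Sketch only; the real run_test.yml may differ in detail.
    - name: Show item
      ansible.builtin.debug:
        var: "{{ item }}"
      loop:
        - lsr_description
        - lsr_setup
        - lsr_test
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup

The values echoed for lsr_setup, lsr_test, lsr_assert and lsr_cleanup in the surrounding entries are the task-file lists that this particular bond test supplies as include parameters.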
11792 1727096122.59032: variable 'lsr_assert' from source: include params 11792 1727096122.59083: variable 'lsr_assert' from source: include params 11792 1727096122.59097: handler run complete 11792 1727096122.59107: attempt loop complete, returning result 11792 1727096122.59118: variable 'item' from source: unknown 11792 1727096122.59165: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_controller_device_present.yml", "tasks/assert_bond_port_profile_present.yml", "tasks/assert_bond_options.yml" ] } 11792 1727096122.59249: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.59252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.59262: variable 'omit' from source: magic vars 11792 1727096122.59351: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.59358: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.59361: variable 'omit' from source: magic vars 11792 1727096122.59376: variable 'omit' from source: magic vars 11792 1727096122.59403: variable 'item' from source: unknown 11792 1727096122.59445: variable 'item' from source: unknown 11792 1727096122.59458: variable 'omit' from source: magic vars 11792 1727096122.59474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.59481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.59488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.59498: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.59501: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.59503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.59546: Set connection var ansible_timeout to 10 11792 1727096122.59555: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.59562: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.59568: Set connection var ansible_pipelining to False 11792 1727096122.59571: Set connection var ansible_shell_type to sh 11792 1727096122.59573: Set connection var ansible_connection to ssh 11792 1727096122.59589: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.59592: variable 'ansible_connection' from source: unknown 11792 1727096122.59594: variable 'ansible_module_compression' from source: unknown 11792 1727096122.59598: variable 'ansible_shell_type' from source: unknown 11792 1727096122.59600: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.59603: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.59605: variable 'ansible_pipelining' from source: unknown 11792 1727096122.59607: variable 'ansible_timeout' from source: unknown 11792 1727096122.59609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.59672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.59679: variable 'omit' from source: magic vars 11792 1727096122.59682: starting attempt loop 11792 1727096122.59684: running the handler 11792 1727096122.59765: handler run complete 11792 1727096122.59776: attempt loop complete, returning result 11792 1727096122.59787: variable 'item' from source: unknown 11792 1727096122.59836: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 11792 1727096122.60169: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.60173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.60175: variable 'omit' from source: magic vars 11792 1727096122.60187: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.60190: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.60192: variable 'omit' from source: magic vars 11792 1727096122.60194: variable 'omit' from source: magic vars 11792 1727096122.60196: variable 'item' from source: unknown 11792 1727096122.60198: variable 'item' from source: unknown 11792 1727096122.60200: variable 'omit' from source: magic vars 11792 1727096122.60209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.60216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.60222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.60231: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.60234: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.60236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.60294: Set connection var ansible_timeout to 10 11792 1727096122.60298: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.60301: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.60305: Set connection var ansible_pipelining to False 11792 1727096122.60308: Set connection var ansible_shell_type to sh 11792 1727096122.60310: Set connection var ansible_connection to ssh 11792 1727096122.60323: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.60326: variable 'ansible_connection' from source: unknown 11792 1727096122.60328: variable 'ansible_module_compression' from source: unknown 11792 1727096122.60330: variable 'ansible_shell_type' from source: unknown 11792 1727096122.60333: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.60335: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.60339: variable 'ansible_pipelining' from source: unknown 11792 1727096122.60342: variable 'ansible_timeout' from source: unknown 11792 1727096122.60346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.60413: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.60420: variable 'omit' from source: magic vars 11792 1727096122.60422: starting attempt loop 11792 1727096122.60425: running the handler 11792 1727096122.60440: variable 'lsr_fail_debug' from source: play vars 11792 1727096122.60488: variable 'lsr_fail_debug' from source: play vars 11792 1727096122.60502: handler run complete 11792 1727096122.60514: attempt loop complete, returning result 11792 1727096122.60525: variable 'item' from source: unknown 11792 1727096122.60571: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 11792 1727096122.60660: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.60663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.60665: variable 'omit' from source: magic vars 11792 1727096122.60754: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.60761: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.60770: variable 'omit' from source: magic vars 11792 1727096122.60778: variable 'omit' from source: magic vars 11792 1727096122.60806: variable 'item' from source: unknown 11792 1727096122.60850: variable 'item' from source: unknown 11792 1727096122.60863: variable 'omit' from source: magic vars 11792 1727096122.60879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.60885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.60893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.60900: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.60903: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.60905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.60951: Set connection var ansible_timeout to 10 11792 1727096122.60960: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.60969: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.60973: Set connection var ansible_pipelining to False 11792 1727096122.60977: Set connection var ansible_shell_type to sh 11792 1727096122.60979: Set connection var ansible_connection to ssh 11792 1727096122.60993: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.60996: variable 'ansible_connection' from source: unknown 11792 1727096122.60998: variable 'ansible_module_compression' from source: unknown 11792 1727096122.61001: variable 'ansible_shell_type' from source: unknown 11792 1727096122.61003: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.61005: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.61009: variable 'ansible_pipelining' from source: unknown 11792 1727096122.61011: variable 'ansible_timeout' from 
source: unknown 11792 1727096122.61016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.61082: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.61090: variable 'omit' from source: magic vars 11792 1727096122.61093: starting attempt loop 11792 1727096122.61095: running the handler 11792 1727096122.61110: variable 'lsr_cleanup' from source: include params 11792 1727096122.61154: variable 'lsr_cleanup' from source: include params 11792 1727096122.61173: handler run complete 11792 1727096122.61183: attempt loop complete, returning result 11792 1727096122.61194: variable 'item' from source: unknown 11792 1727096122.61263: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml" ] } 11792 1727096122.61345: dumping result to json 11792 1727096122.61348: done dumping result, returning 11792 1727096122.61350: done running TaskExecutor() for managed_node2/TASK: Show item [0afff68d-5257-d9c7-3fc0-000000000089] 11792 1727096122.61352: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000089 11792 1727096122.61393: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000089 11792 1727096122.61395: WORKER PROCESS EXITING 11792 1727096122.61438: no more pending results, returning what we have 11792 1727096122.61441: results queue empty 11792 1727096122.61441: checking for any_errors_fatal 11792 1727096122.61446: done checking for any_errors_fatal 11792 1727096122.61447: checking for max_fail_percentage 11792 1727096122.61448: done checking for max_fail_percentage 11792 1727096122.61449: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.61449: done checking to see if all hosts have failed 11792 1727096122.61450: getting the remaining hosts for this loop 11792 1727096122.61452: done getting the remaining hosts for this loop 11792 1727096122.61454: getting the next task for host managed_node2 11792 1727096122.61460: done getting next task for host managed_node2 11792 1727096122.61462: ^ task is: TASK: Include the task 'show_interfaces.yml' 11792 1727096122.61465: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096122.61469: getting variables 11792 1727096122.61471: in VariableManager get_vars() 11792 1727096122.61497: Calling all_inventory to load vars for managed_node2 11792 1727096122.61500: Calling groups_inventory to load vars for managed_node2 11792 1727096122.61503: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.61513: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.61515: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.61518: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.61661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.61779: done with get_vars() 11792 1727096122.61788: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Monday 23 September 2024 08:55:22 -0400 (0:00:00.065) 0:00:04.897 ****** 11792 1727096122.61854: entering _queue_task() for managed_node2/include_tasks 11792 1727096122.62080: worker is 1 (out of 1 available) 11792 1727096122.62091: exiting _queue_task() for managed_node2/include_tasks 11792 1727096122.62104: done queuing things up, now waiting for results queue to drain 11792 1727096122.62105: waiting for pending results... 11792 1727096122.62259: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 11792 1727096122.62328: in run() - task 0afff68d-5257-d9c7-3fc0-00000000008a 11792 1727096122.62344: variable 'ansible_search_path' from source: unknown 11792 1727096122.62348: variable 'ansible_search_path' from source: unknown 11792 1727096122.62375: calling self._execute() 11792 1727096122.62434: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.62440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.62459: variable 'omit' from source: magic vars 11792 1727096122.62722: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.62731: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.62737: _execute() done 11792 1727096122.62740: dumping result to json 11792 1727096122.62742: done dumping result, returning 11792 1727096122.62748: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-d9c7-3fc0-00000000008a] 11792 1727096122.62754: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008a 11792 1727096122.62841: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008a 11792 1727096122.62844: WORKER PROCESS EXITING 11792 1727096122.62876: no more pending results, returning what we have 11792 1727096122.62882: in VariableManager get_vars() 11792 1727096122.62915: Calling all_inventory to load vars for managed_node2 11792 1727096122.62918: Calling groups_inventory to load vars for managed_node2 11792 1727096122.62922: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.62934: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.62936: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.62938: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.63113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11792 1727096122.63228: done with get_vars() 11792 1727096122.63234: variable 'ansible_search_path' from source: unknown 11792 1727096122.63235: variable 'ansible_search_path' from source: unknown 11792 1727096122.63268: we have included files to process 11792 1727096122.63269: generating all_blocks data 11792 1727096122.63271: done generating all_blocks data 11792 1727096122.63273: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11792 1727096122.63274: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11792 1727096122.63275: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11792 1727096122.63379: in VariableManager get_vars() 11792 1727096122.63392: done with get_vars() 11792 1727096122.63463: done processing included file 11792 1727096122.63465: iterating over new_blocks loaded from include file 11792 1727096122.63466: in VariableManager get_vars() 11792 1727096122.63478: done with get_vars() 11792 1727096122.63479: filtering new block on tags 11792 1727096122.63514: done filtering new block on tags 11792 1727096122.63516: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 11792 1727096122.63519: extending task lists for all hosts with included blocks 11792 1727096122.63777: done extending task lists 11792 1727096122.63779: done processing included files 11792 1727096122.63779: results queue empty 11792 1727096122.63779: checking for any_errors_fatal 11792 1727096122.63784: done checking for any_errors_fatal 11792 1727096122.63785: checking for max_fail_percentage 11792 1727096122.63785: done checking for max_fail_percentage 11792 1727096122.63786: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.63786: done checking to see if all hosts have failed 11792 1727096122.63787: getting the remaining hosts for this loop 11792 1727096122.63788: done getting the remaining hosts for this loop 11792 1727096122.63789: getting the next task for host managed_node2 11792 1727096122.63792: done getting next task for host managed_node2 11792 1727096122.63794: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 11792 1727096122.63797: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096122.63799: getting variables 11792 1727096122.63800: in VariableManager get_vars() 11792 1727096122.63807: Calling all_inventory to load vars for managed_node2 11792 1727096122.63808: Calling groups_inventory to load vars for managed_node2 11792 1727096122.63810: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.63814: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.63815: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.63817: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.63925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.64055: done with get_vars() 11792 1727096122.64062: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:55:22 -0400 (0:00:00.022) 0:00:04.920 ****** 11792 1727096122.64114: entering _queue_task() for managed_node2/include_tasks 11792 1727096122.64348: worker is 1 (out of 1 available) 11792 1727096122.64359: exiting _queue_task() for managed_node2/include_tasks 11792 1727096122.64373: done queuing things up, now waiting for results queue to drain 11792 1727096122.64375: waiting for pending results... 11792 1727096122.64532: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 11792 1727096122.64602: in run() - task 0afff68d-5257-d9c7-3fc0-0000000000b1 11792 1727096122.64617: variable 'ansible_search_path' from source: unknown 11792 1727096122.64621: variable 'ansible_search_path' from source: unknown 11792 1727096122.64643: calling self._execute() 11792 1727096122.64703: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.64707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.64726: variable 'omit' from source: magic vars 11792 1727096122.64999: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.65008: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.65014: _execute() done 11792 1727096122.65018: dumping result to json 11792 1727096122.65020: done dumping result, returning 11792 1727096122.65027: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-d9c7-3fc0-0000000000b1] 11792 1727096122.65030: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000b1 11792 1727096122.65118: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000b1 11792 1727096122.65120: WORKER PROCESS EXITING 11792 1727096122.65183: no more pending results, returning what we have 11792 1727096122.65188: in VariableManager get_vars() 11792 1727096122.65221: Calling all_inventory to load vars for managed_node2 11792 1727096122.65224: Calling groups_inventory to load vars for managed_node2 11792 1727096122.65227: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.65238: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.65241: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.65244: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.65419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 11792 1727096122.65734: done with get_vars() 11792 1727096122.65744: variable 'ansible_search_path' from source: unknown 11792 1727096122.65746: variable 'ansible_search_path' from source: unknown 11792 1727096122.65784: we have included files to process 11792 1727096122.65786: generating all_blocks data 11792 1727096122.65787: done generating all_blocks data 11792 1727096122.65789: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11792 1727096122.65790: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11792 1727096122.65793: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11792 1727096122.66622: done processing included file 11792 1727096122.66624: iterating over new_blocks loaded from include file 11792 1727096122.66627: in VariableManager get_vars() 11792 1727096122.66642: done with get_vars() 11792 1727096122.66644: filtering new block on tags 11792 1727096122.66690: done filtering new block on tags 11792 1727096122.66693: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 11792 1727096122.66698: extending task lists for all hosts with included blocks 11792 1727096122.66940: done extending task lists 11792 1727096122.66941: done processing included files 11792 1727096122.66942: results queue empty 11792 1727096122.66942: checking for any_errors_fatal 11792 1727096122.66945: done checking for any_errors_fatal 11792 1727096122.66945: checking for max_fail_percentage 11792 1727096122.66946: done checking for max_fail_percentage 11792 1727096122.66946: checking to see if all hosts have failed and the running result is not ok 11792 1727096122.66947: done checking to see if all hosts have failed 11792 1727096122.66947: getting the remaining hosts for this loop 11792 1727096122.66949: done getting the remaining hosts for this loop 11792 1727096122.66950: getting the next task for host managed_node2 11792 1727096122.66961: done getting next task for host managed_node2 11792 1727096122.66963: ^ task is: TASK: Gather current interface info 11792 1727096122.66966: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11792 1727096122.66971: getting variables 11792 1727096122.66972: in VariableManager get_vars() 11792 1727096122.66995: Calling all_inventory to load vars for managed_node2 11792 1727096122.66998: Calling groups_inventory to load vars for managed_node2 11792 1727096122.66999: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096122.67004: Calling all_plugins_play to load vars for managed_node2 11792 1727096122.67005: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096122.67007: Calling groups_plugins_play to load vars for managed_node2 11792 1727096122.67124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096122.67244: done with get_vars() 11792 1727096122.67254: done getting variables 11792 1727096122.67284: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:55:22 -0400 (0:00:00.031) 0:00:04.952 ****** 11792 1727096122.67309: entering _queue_task() for managed_node2/command 11792 1727096122.67555: worker is 1 (out of 1 available) 11792 1727096122.67569: exiting _queue_task() for managed_node2/command 11792 1727096122.67583: done queuing things up, now waiting for results queue to drain 11792 1727096122.67584: waiting for pending results... 
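The entries above trace a chain of included task files: the task at run_test.yml:21 includes show_interfaces.yml, whose first task at show_interfaces.yml:3 includes get_current_interfaces.yml, and its task 'Gather current interface info' at get_current_interfaces.yml:3 is dispatched through the command action plugin. The files themselves are not reproduced in this log; a hedged sketch of that chain is below, where the relative include paths, the registered variable name and the command itself are assumptions for illustration only:

    # run_test.yml (sketch)
    - name: Include the task 'show_interfaces.yml'
      ansible.builtin.include_tasks: tasks/show_interfaces.yml

    # show_interfaces.yml (sketch)
    - name: Include the task 'get_current_interfaces.yml'
      ansible.builtin.include_tasks: get_current_interfaces.yml

    # get_current_interfaces.yml (sketch; command and register name are hypothetical)
    - name: Gather current interface info
      ansible.builtin.command: ls -1 /sys/class/net
      register: current_interface_info

The entries that follow show the ssh connection plugin preparing to run this command module on managed_node2: echoing the remote home directory, creating a temporary directory under /root/.ansible/tmp, and building the ANSIBALLZ payload for ansible.legacy.command.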
11792 1727096122.67730: running TaskExecutor() for managed_node2/TASK: Gather current interface info 11792 1727096122.67794: in run() - task 0afff68d-5257-d9c7-3fc0-0000000000ec 11792 1727096122.67808: variable 'ansible_search_path' from source: unknown 11792 1727096122.67813: variable 'ansible_search_path' from source: unknown 11792 1727096122.67842: calling self._execute() 11792 1727096122.67902: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.67907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.67916: variable 'omit' from source: magic vars 11792 1727096122.68190: variable 'ansible_distribution_major_version' from source: facts 11792 1727096122.68199: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096122.68207: variable 'omit' from source: magic vars 11792 1727096122.68286: variable 'omit' from source: magic vars 11792 1727096122.68324: variable 'omit' from source: magic vars 11792 1727096122.68365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096122.68409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096122.68441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096122.68455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.68462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096122.68518: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096122.68521: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.68525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.68672: Set connection var ansible_timeout to 10 11792 1727096122.68675: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096122.68677: Set connection var ansible_shell_executable to /bin/sh 11792 1727096122.68680: Set connection var ansible_pipelining to False 11792 1727096122.68682: Set connection var ansible_shell_type to sh 11792 1727096122.68684: Set connection var ansible_connection to ssh 11792 1727096122.68774: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.68777: variable 'ansible_connection' from source: unknown 11792 1727096122.68781: variable 'ansible_module_compression' from source: unknown 11792 1727096122.68783: variable 'ansible_shell_type' from source: unknown 11792 1727096122.68785: variable 'ansible_shell_executable' from source: unknown 11792 1727096122.68787: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096122.68789: variable 'ansible_pipelining' from source: unknown 11792 1727096122.68790: variable 'ansible_timeout' from source: unknown 11792 1727096122.68792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096122.68893: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096122.68921: variable 'omit' from source: magic vars 11792 
1727096122.69022: starting attempt loop 11792 1727096122.69026: running the handler 11792 1727096122.69028: _low_level_execute_command(): starting 11792 1727096122.69030: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096122.70193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096122.70212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096122.70299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096122.70469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11792 1727096122.72889: stdout chunk (state=3): >>>/root <<< 11792 1727096122.73085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096122.73089: stdout chunk (state=3): >>><<< 11792 1727096122.73092: stderr chunk (state=3): >>><<< 11792 1727096122.73465: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11792 1727096122.73472: _low_level_execute_command(): starting 11792 1727096122.73476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895 `" && echo ansible-tmp-1727096122.7329297-12062-261339810715895="` echo /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895 `" ) && 
sleep 0' 11792 1727096122.74673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096122.74784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096122.74802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096122.74871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096122.75182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096122.75213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096122.75346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096122.77382: stdout chunk (state=3): >>>ansible-tmp-1727096122.7329297-12062-261339810715895=/root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895 <<< 11792 1727096122.77479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096122.77581: stderr chunk (state=3): >>><<< 11792 1727096122.77590: stdout chunk (state=3): >>><<< 11792 1727096122.77615: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096122.7329297-12062-261339810715895=/root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096122.78075: variable 'ansible_module_compression' from source: unknown 11792 1727096122.78078: ANSIBALLZ: Using generic lock for ansible.legacy.command 11792 1727096122.78081: ANSIBALLZ: Acquiring lock 11792 1727096122.78083: ANSIBALLZ: Lock acquired: 
139635227775856 11792 1727096122.78085: ANSIBALLZ: Creating module 11792 1727096123.00394: ANSIBALLZ: Writing module into payload 11792 1727096123.00492: ANSIBALLZ: Writing module 11792 1727096123.00519: ANSIBALLZ: Renaming module 11792 1727096123.00529: ANSIBALLZ: Done creating module 11792 1727096123.00551: variable 'ansible_facts' from source: unknown 11792 1727096123.00635: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/AnsiballZ_command.py 11792 1727096123.00871: Sending initial data 11792 1727096123.00881: Sent initial data (156 bytes) 11792 1727096123.01484: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.01517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096123.01544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.01569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.01641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.03309: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096123.03324: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11792 1727096123.03341: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096123.03405: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096123.03467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/AnsiballZ_command.py" <<< 11792 1727096123.03496: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpb6y0ehi2 /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/AnsiballZ_command.py <<< 11792 1727096123.03530: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpb6y0ehi2" to remote "/root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/AnsiballZ_command.py" <<< 11792 1727096123.04233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096123.04271: stderr chunk (state=3): >>><<< 11792 1727096123.04373: stdout chunk (state=3): >>><<< 11792 1727096123.04377: done transferring module to remote 11792 1727096123.04379: _low_level_execute_command(): starting 11792 1727096123.04382: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/ /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/AnsiballZ_command.py && sleep 0' 11792 1727096123.05501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096123.05505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096123.05507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096123.05510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.05555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096123.05594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.05810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.05921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.07776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096123.07842: stderr chunk (state=3): >>><<< 11792 1727096123.07870: stdout chunk (state=3): >>><<< 11792 1727096123.07982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096123.07987: _low_level_execute_command(): starting 11792 1727096123.07989: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/AnsiballZ_command.py && sleep 0' 11792 1727096123.08911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096123.08914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.08916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096123.08919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096123.08921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.08974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.08988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.09047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.25260: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:55:23.247630", "end": "2024-09-23 08:55:23.251152", "delta": "0:00:00.003522", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096123.27347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096123.27354: stdout chunk (state=3): >>><<< 11792 1727096123.27356: stderr chunk (state=3): >>><<< 11792 1727096123.27359: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:55:23.247630", "end": "2024-09-23 08:55:23.251152", "delta": "0:00:00.003522", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
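The module result above (module_args: "_raw_params": "ls -1", "chdir": "/sys/class/net") shows that the "Gather current interface info" task is an ansible.legacy.command invocation that lists the kernel's network interfaces from sysfs and captures the output for later use. A minimal sketch of such a task follows; the free-form command with an args: chdir matches the _raw_params/chdir seen in the invocation, while the register name _current_interfaces is inferred from the later "Set current_interfaces" entries, and changed_when is an assumption offered to explain why the rendered result reports changed: false even though the raw module JSON reports changed: true. The actual contents of get_current_interfaces.yml are not reproduced in this log.

- name: Gather current interface info
  command: ls -1            # free-form command, surfaces as _raw_params in the invocation
  args:
    chdir: /sys/class/net   # each entry under /sys/class/net is an interface name
  register: _current_interfaces   # assumed register target; registered vars show as "from source: set_fact" in later entries
  changed_when: false             # assumed; would account for the callback reporting changed: false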
11792 1727096123.27362: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096123.27365: _low_level_execute_command(): starting 11792 1727096123.27369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096122.7329297-12062-261339810715895/ > /dev/null 2>&1 && sleep 0' 11792 1727096123.27997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.28017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.28044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.30040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096123.30097: stderr chunk (state=3): >>><<< 11792 1727096123.30275: stdout chunk (state=3): >>><<< 11792 1727096123.30279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096123.30281: handler run complete 11792 1727096123.30284: Evaluated conditional (False): False 11792 1727096123.30286: attempt loop complete, returning result 11792 1727096123.30288: _execute() done 11792 1727096123.30290: dumping result to json 11792 1727096123.30292: done dumping result, returning 11792 1727096123.30294: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-d9c7-3fc0-0000000000ec] 11792 1727096123.30296: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000ec 11792 1727096123.30374: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000ec 11792 1727096123.30378: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003522", "end": "2024-09-23 08:55:23.251152", "rc": 0, "start": "2024-09-23 08:55:23.247630" } STDOUT: eth0 lo 11792 1727096123.30456: no more pending results, returning what we have 11792 1727096123.30461: results queue empty 11792 1727096123.30462: checking for any_errors_fatal 11792 1727096123.30464: done checking for any_errors_fatal 11792 1727096123.30464: checking for max_fail_percentage 11792 1727096123.30466: done checking for max_fail_percentage 11792 1727096123.30469: checking to see if all hosts have failed and the running result is not ok 11792 1727096123.30470: done checking to see if all hosts have failed 11792 1727096123.30471: getting the remaining hosts for this loop 11792 1727096123.30472: done getting the remaining hosts for this loop 11792 1727096123.30476: getting the next task for host managed_node2 11792 1727096123.30485: done getting next task for host managed_node2 11792 1727096123.30487: ^ task is: TASK: Set current_interfaces 11792 1727096123.30492: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096123.30496: getting variables 11792 1727096123.30498: in VariableManager get_vars() 11792 1727096123.30530: Calling all_inventory to load vars for managed_node2 11792 1727096123.30532: Calling groups_inventory to load vars for managed_node2 11792 1727096123.30536: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096123.30548: Calling all_plugins_play to load vars for managed_node2 11792 1727096123.30554: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096123.30558: Calling groups_plugins_play to load vars for managed_node2 11792 1727096123.31398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096123.31653: done with get_vars() 11792 1727096123.31671: done getting variables 11792 1727096123.31746: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:55:23 -0400 (0:00:00.644) 0:00:05.596 ****** 11792 1727096123.31781: entering _queue_task() for managed_node2/set_fact 11792 1727096123.32018: worker is 1 (out of 1 available) 11792 1727096123.32031: exiting _queue_task() for managed_node2/set_fact 11792 1727096123.32044: done queuing things up, now waiting for results queue to drain 11792 1727096123.32046: waiting for pending results... 
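The set_fact action loaded for this task takes the registered command output and publishes it as the current_interfaces fact; the result a few entries below shows the fact ends up as ['eth0', 'lo']. A plausible sketch of the task at get_current_interfaces.yml:9 is shown here; the stdout_lines split is an assumption, and any equivalent conversion of the registered stdout to a list would produce the same fact.

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # "eth0\nlo" becomes ['eth0', 'lo']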
11792 1727096123.32197: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 11792 1727096123.32261: in run() - task 0afff68d-5257-d9c7-3fc0-0000000000ed 11792 1727096123.32275: variable 'ansible_search_path' from source: unknown 11792 1727096123.32279: variable 'ansible_search_path' from source: unknown 11792 1727096123.32309: calling self._execute() 11792 1727096123.32366: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.32372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.32382: variable 'omit' from source: magic vars 11792 1727096123.32642: variable 'ansible_distribution_major_version' from source: facts 11792 1727096123.32651: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096123.32659: variable 'omit' from source: magic vars 11792 1727096123.32697: variable 'omit' from source: magic vars 11792 1727096123.32774: variable '_current_interfaces' from source: set_fact 11792 1727096123.32822: variable 'omit' from source: magic vars 11792 1727096123.32853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096123.32884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096123.32899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096123.32912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096123.32925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096123.32946: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096123.32949: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.32955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.33021: Set connection var ansible_timeout to 10 11792 1727096123.33029: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096123.33038: Set connection var ansible_shell_executable to /bin/sh 11792 1727096123.33045: Set connection var ansible_pipelining to False 11792 1727096123.33048: Set connection var ansible_shell_type to sh 11792 1727096123.33050: Set connection var ansible_connection to ssh 11792 1727096123.33066: variable 'ansible_shell_executable' from source: unknown 11792 1727096123.33072: variable 'ansible_connection' from source: unknown 11792 1727096123.33074: variable 'ansible_module_compression' from source: unknown 11792 1727096123.33076: variable 'ansible_shell_type' from source: unknown 11792 1727096123.33079: variable 'ansible_shell_executable' from source: unknown 11792 1727096123.33081: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.33083: variable 'ansible_pipelining' from source: unknown 11792 1727096123.33086: variable 'ansible_timeout' from source: unknown 11792 1727096123.33090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.33193: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11792 1727096123.33201: variable 'omit' from source: magic vars 11792 1727096123.33206: starting attempt loop 11792 1727096123.33209: running the handler 11792 1727096123.33218: handler run complete 11792 1727096123.33226: attempt loop complete, returning result 11792 1727096123.33228: _execute() done 11792 1727096123.33231: dumping result to json 11792 1727096123.33233: done dumping result, returning 11792 1727096123.33240: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-d9c7-3fc0-0000000000ed] 11792 1727096123.33243: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000ed 11792 1727096123.33324: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000ed 11792 1727096123.33327: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 11792 1727096123.33410: no more pending results, returning what we have 11792 1727096123.33413: results queue empty 11792 1727096123.33414: checking for any_errors_fatal 11792 1727096123.33420: done checking for any_errors_fatal 11792 1727096123.33421: checking for max_fail_percentage 11792 1727096123.33423: done checking for max_fail_percentage 11792 1727096123.33423: checking to see if all hosts have failed and the running result is not ok 11792 1727096123.33424: done checking to see if all hosts have failed 11792 1727096123.33425: getting the remaining hosts for this loop 11792 1727096123.33427: done getting the remaining hosts for this loop 11792 1727096123.33430: getting the next task for host managed_node2 11792 1727096123.33441: done getting next task for host managed_node2 11792 1727096123.33443: ^ task is: TASK: Show current_interfaces 11792 1727096123.33446: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096123.33450: getting variables 11792 1727096123.33451: in VariableManager get_vars() 11792 1727096123.33481: Calling all_inventory to load vars for managed_node2 11792 1727096123.33484: Calling groups_inventory to load vars for managed_node2 11792 1727096123.33487: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096123.33495: Calling all_plugins_play to load vars for managed_node2 11792 1727096123.33497: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096123.33500: Calling groups_plugins_play to load vars for managed_node2 11792 1727096123.33744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096123.33943: done with get_vars() 11792 1727096123.33955: done getting variables 11792 1727096123.34012: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:55:23 -0400 (0:00:00.022) 0:00:05.619 ****** 11792 1727096123.34044: entering _queue_task() for managed_node2/debug 11792 1727096123.34299: worker is 1 (out of 1 available) 11792 1727096123.34311: exiting _queue_task() for managed_node2/debug 11792 1727096123.34324: done queuing things up, now waiting for results queue to drain 11792 1727096123.34326: waiting for pending results... 11792 1727096123.34613: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 11792 1727096123.34648: in run() - task 0afff68d-5257-d9c7-3fc0-0000000000b2 11792 1727096123.34662: variable 'ansible_search_path' from source: unknown 11792 1727096123.34681: variable 'ansible_search_path' from source: unknown 11792 1727096123.34699: calling self._execute() 11792 1727096123.34760: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.34765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.34775: variable 'omit' from source: magic vars 11792 1727096123.35036: variable 'ansible_distribution_major_version' from source: facts 11792 1727096123.35045: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096123.35051: variable 'omit' from source: magic vars 11792 1727096123.35087: variable 'omit' from source: magic vars 11792 1727096123.35157: variable 'current_interfaces' from source: set_fact 11792 1727096123.35182: variable 'omit' from source: magic vars 11792 1727096123.35211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096123.35241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096123.35259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096123.35273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096123.35282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096123.35304: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096123.35307: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.35309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.35382: Set connection var ansible_timeout to 10 11792 1727096123.35387: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096123.35395: Set connection var ansible_shell_executable to /bin/sh 11792 1727096123.35400: Set connection var ansible_pipelining to False 11792 1727096123.35402: Set connection var ansible_shell_type to sh 11792 1727096123.35405: Set connection var ansible_connection to ssh 11792 1727096123.35420: variable 'ansible_shell_executable' from source: unknown 11792 1727096123.35423: variable 'ansible_connection' from source: unknown 11792 1727096123.35426: variable 'ansible_module_compression' from source: unknown 11792 1727096123.35429: variable 'ansible_shell_type' from source: unknown 11792 1727096123.35431: variable 'ansible_shell_executable' from source: unknown 11792 1727096123.35433: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.35438: variable 'ansible_pipelining' from source: unknown 11792 1727096123.35440: variable 'ansible_timeout' from source: unknown 11792 1727096123.35442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.35545: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096123.35560: variable 'omit' from source: magic vars 11792 1727096123.35563: starting attempt loop 11792 1727096123.35566: running the handler 11792 1727096123.35601: handler run complete 11792 1727096123.35611: attempt loop complete, returning result 11792 1727096123.35614: _execute() done 11792 1727096123.35617: dumping result to json 11792 1727096123.35619: done dumping result, returning 11792 1727096123.35625: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-d9c7-3fc0-0000000000b2] 11792 1727096123.35628: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000b2 11792 1727096123.35711: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000000b2 11792 1727096123.35714: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['eth0', 'lo'] 11792 1727096123.35760: no more pending results, returning what we have 11792 1727096123.35764: results queue empty 11792 1727096123.35764: checking for any_errors_fatal 11792 1727096123.35773: done checking for any_errors_fatal 11792 1727096123.35774: checking for max_fail_percentage 11792 1727096123.35776: done checking for max_fail_percentage 11792 1727096123.35776: checking to see if all hosts have failed and the running result is not ok 11792 1727096123.35777: done checking to see if all hosts have failed 11792 1727096123.35778: getting the remaining hosts for this loop 11792 1727096123.35779: done getting the remaining hosts for this loop 11792 1727096123.35783: getting the next task for host managed_node2 11792 1727096123.35790: done getting next task for host managed_node2 11792 1727096123.35793: ^ task is: TASK: Setup 11792 1727096123.35795: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096123.35799: getting variables 11792 1727096123.35801: in VariableManager get_vars() 11792 1727096123.35833: Calling all_inventory to load vars for managed_node2 11792 1727096123.35836: Calling groups_inventory to load vars for managed_node2 11792 1727096123.35839: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096123.35848: Calling all_plugins_play to load vars for managed_node2 11792 1727096123.35850: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096123.35853: Calling groups_plugins_play to load vars for managed_node2 11792 1727096123.36013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096123.36132: done with get_vars() 11792 1727096123.36139: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Monday 23 September 2024 08:55:23 -0400 (0:00:00.021) 0:00:05.641 ****** 11792 1727096123.36202: entering _queue_task() for managed_node2/include_tasks 11792 1727096123.36401: worker is 1 (out of 1 available) 11792 1727096123.36415: exiting _queue_task() for managed_node2/include_tasks 11792 1727096123.36427: done queuing things up, now waiting for results queue to drain 11792 1727096123.36429: waiting for pending results... 
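The Setup task at run_test.yml:24 is an include_tasks action; the entries that follow show it evaluating its conditional once per item and queuing two included files, tasks/create_test_interfaces_with_dhcp.yml and tasks/assert_dhcp_device_present.yml, sourced from the lsr_setup include parameter. A hedged sketch of such a loop-driven include is below; the exact playbook source is not part of this log, and the per-item ansible_distribution_major_version check seen above may be inherited from an enclosing block rather than set on the task itself.

- name: Setup
  include_tasks: "{{ item }}"   # each item is a task file path relative to the playbook
  loop: "{{ lsr_setup }}"       # supplied by the calling play; in this run it resolves to
                                # tasks/create_test_interfaces_with_dhcp.yml and
                                # tasks/assert_dhcp_device_present.yml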
11792 1727096123.36607: running TaskExecutor() for managed_node2/TASK: Setup 11792 1727096123.36772: in run() - task 0afff68d-5257-d9c7-3fc0-00000000008b 11792 1727096123.36776: variable 'ansible_search_path' from source: unknown 11792 1727096123.36780: variable 'ansible_search_path' from source: unknown 11792 1727096123.36783: variable 'lsr_setup' from source: include params 11792 1727096123.36974: variable 'lsr_setup' from source: include params 11792 1727096123.37047: variable 'omit' from source: magic vars 11792 1727096123.37166: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.37185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.37201: variable 'omit' from source: magic vars 11792 1727096123.37435: variable 'ansible_distribution_major_version' from source: facts 11792 1727096123.37453: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096123.37466: variable 'item' from source: unknown 11792 1727096123.37534: variable 'item' from source: unknown 11792 1727096123.37772: variable 'item' from source: unknown 11792 1727096123.37775: variable 'item' from source: unknown 11792 1727096123.37881: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.37888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.37891: variable 'omit' from source: magic vars 11792 1727096123.37994: variable 'ansible_distribution_major_version' from source: facts 11792 1727096123.38007: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096123.38017: variable 'item' from source: unknown 11792 1727096123.38085: variable 'item' from source: unknown 11792 1727096123.38124: variable 'item' from source: unknown 11792 1727096123.38206: variable 'item' from source: unknown 11792 1727096123.38265: dumping result to json 11792 1727096123.38270: done dumping result, returning 11792 1727096123.38272: done running TaskExecutor() for managed_node2/TASK: Setup [0afff68d-5257-d9c7-3fc0-00000000008b] 11792 1727096123.38275: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008b 11792 1727096123.38316: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008b 11792 1727096123.38319: WORKER PROCESS EXITING 11792 1727096123.38393: no more pending results, returning what we have 11792 1727096123.38398: in VariableManager get_vars() 11792 1727096123.38431: Calling all_inventory to load vars for managed_node2 11792 1727096123.38434: Calling groups_inventory to load vars for managed_node2 11792 1727096123.38437: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096123.38450: Calling all_plugins_play to load vars for managed_node2 11792 1727096123.38453: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096123.38456: Calling groups_plugins_play to load vars for managed_node2 11792 1727096123.38828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096123.39008: done with get_vars() 11792 1727096123.39015: variable 'ansible_search_path' from source: unknown 11792 1727096123.39016: variable 'ansible_search_path' from source: unknown 11792 1727096123.39052: variable 'ansible_search_path' from source: unknown 11792 1727096123.39053: variable 'ansible_search_path' from source: unknown 11792 1727096123.39080: we have included files to process 11792 1727096123.39081: generating all_blocks data 11792 
1727096123.39082: done generating all_blocks data 11792 1727096123.39086: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11792 1727096123.39087: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11792 1727096123.39089: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11792 1727096123.40290: done processing included file 11792 1727096123.40292: iterating over new_blocks loaded from include file 11792 1727096123.40293: in VariableManager get_vars() 11792 1727096123.40306: done with get_vars() 11792 1727096123.40307: filtering new block on tags 11792 1727096123.40362: done filtering new block on tags 11792 1727096123.40365: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/create_test_interfaces_with_dhcp.yml) 11792 1727096123.40374: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11792 1727096123.40376: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11792 1727096123.40379: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11792 1727096123.40513: in VariableManager get_vars() 11792 1727096123.40541: done with get_vars() 11792 1727096123.40548: variable 'item' from source: include params 11792 1727096123.40664: variable 'item' from source: include params 11792 1727096123.40700: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11792 1727096123.40799: in VariableManager get_vars() 11792 1727096123.40811: done with get_vars() 11792 1727096123.40935: in VariableManager get_vars() 11792 1727096123.40954: done with get_vars() 11792 1727096123.40960: variable 'item' from source: include params 11792 1727096123.41021: variable 'item' from source: include params 11792 1727096123.41054: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11792 1727096123.41127: in VariableManager get_vars() 11792 1727096123.41143: done with get_vars() 11792 1727096123.41241: done processing included file 11792 1727096123.41243: iterating over new_blocks loaded from include file 11792 1727096123.41245: in VariableManager get_vars() 11792 1727096123.41261: done with get_vars() 11792 1727096123.41263: filtering new block on tags 11792 1727096123.41366: done filtering new block on tags 11792 1727096123.41371: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node2 => (item=tasks/assert_dhcp_device_present.yml) 11792 1727096123.41375: extending task lists for all hosts with included blocks 11792 1727096123.41955: done extending task lists 11792 1727096123.41956: done processing included files 11792 1727096123.41957: results queue empty 11792 1727096123.41958: checking for any_errors_fatal 11792 1727096123.41961: done checking for any_errors_fatal 11792 1727096123.41962: checking for max_fail_percentage 11792 1727096123.41966: done checking for max_fail_percentage 11792 1727096123.41968: checking to see if all hosts have failed and the running result is not ok 11792 1727096123.41969: done checking to see if all hosts have failed 11792 1727096123.41974: getting the remaining hosts for this loop 11792 1727096123.41975: done getting the remaining hosts for this loop 11792 1727096123.41977: getting the next task for host managed_node2 11792 1727096123.41980: done getting next task for host managed_node2 11792 1727096123.41981: ^ task is: TASK: Install dnsmasq 11792 1727096123.41983: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096123.41985: getting variables 11792 1727096123.41986: in VariableManager get_vars() 11792 1727096123.41992: Calling all_inventory to load vars for managed_node2 11792 1727096123.41993: Calling groups_inventory to load vars for managed_node2 11792 1727096123.41995: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096123.41999: Calling all_plugins_play to load vars for managed_node2 11792 1727096123.42000: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096123.42002: Calling groups_plugins_play to load vars for managed_node2 11792 1727096123.42109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096123.42226: done with get_vars() 11792 1727096123.42232: done getting variables 11792 1727096123.42264: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:55:23 -0400 (0:00:00.060) 0:00:05.702 ****** 11792 1727096123.42287: entering _queue_task() for managed_node2/package 11792 1727096123.42514: worker is 1 (out of 1 available) 11792 1727096123.42527: exiting _queue_task() for managed_node2/package 11792 1727096123.42540: done queuing things up, now waiting for results queue to drain 11792 1727096123.42541: waiting for pending results... 
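The Install dnsmasq task resolves to the generic package action loaded above, which delegates to the target's native package manager module at runtime. Based only on the task name and action plugin visible in the log, a minimal reconstruction might look like the following; the actual arguments in create_test_interfaces_with_dhcp.yml:3 are not shown here, so the package name and state are assumptions.

- name: Install dnsmasq
  package:
    name: dnsmasq      # assumed from the task name
    state: present     # assumed; a typical prerequisite for the DHCP test interfaces set up next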
11792 1727096123.42695: running TaskExecutor() for managed_node2/TASK: Install dnsmasq 11792 1727096123.42754: in run() - task 0afff68d-5257-d9c7-3fc0-000000000112 11792 1727096123.42764: variable 'ansible_search_path' from source: unknown 11792 1727096123.42769: variable 'ansible_search_path' from source: unknown 11792 1727096123.42797: calling self._execute() 11792 1727096123.42890: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.42894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.42904: variable 'omit' from source: magic vars 11792 1727096123.43156: variable 'ansible_distribution_major_version' from source: facts 11792 1727096123.43164: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096123.43171: variable 'omit' from source: magic vars 11792 1727096123.43202: variable 'omit' from source: magic vars 11792 1727096123.43335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096123.45179: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096123.45231: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096123.45262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096123.45289: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096123.45308: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096123.45384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096123.45403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096123.45420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096123.45447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096123.45461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096123.45536: variable '__network_is_ostree' from source: set_fact 11792 1727096123.45540: variable 'omit' from source: magic vars 11792 1727096123.45565: variable 'omit' from source: magic vars 11792 1727096123.45591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096123.45611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096123.45626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096123.45638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11792 1727096123.45647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096123.45673: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096123.45676: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.45680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.45745: Set connection var ansible_timeout to 10 11792 1727096123.45752: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096123.45761: Set connection var ansible_shell_executable to /bin/sh 11792 1727096123.45766: Set connection var ansible_pipelining to False 11792 1727096123.45771: Set connection var ansible_shell_type to sh 11792 1727096123.45773: Set connection var ansible_connection to ssh 11792 1727096123.45790: variable 'ansible_shell_executable' from source: unknown 11792 1727096123.45795: variable 'ansible_connection' from source: unknown 11792 1727096123.45797: variable 'ansible_module_compression' from source: unknown 11792 1727096123.45800: variable 'ansible_shell_type' from source: unknown 11792 1727096123.45802: variable 'ansible_shell_executable' from source: unknown 11792 1727096123.45804: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096123.45806: variable 'ansible_pipelining' from source: unknown 11792 1727096123.45808: variable 'ansible_timeout' from source: unknown 11792 1727096123.45818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096123.45887: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096123.45898: variable 'omit' from source: magic vars 11792 1727096123.45902: starting attempt loop 11792 1727096123.45906: running the handler 11792 1727096123.45912: variable 'ansible_facts' from source: unknown 11792 1727096123.45914: variable 'ansible_facts' from source: unknown 11792 1727096123.45958: _low_level_execute_command(): starting 11792 1727096123.45963: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096123.46458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096123.46470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.46474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096123.46477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.46521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096123.46524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.46526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.46575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.48271: stdout chunk (state=3): >>>/root <<< 11792 1727096123.48361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096123.48399: stderr chunk (state=3): >>><<< 11792 1727096123.48402: stdout chunk (state=3): >>><<< 11792 1727096123.48424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096123.48435: _low_level_execute_command(): starting 11792 1727096123.48441: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049 `" && echo ansible-tmp-1727096123.484235-12103-137794156391049="` echo /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049 `" ) && sleep 0' 11792 1727096123.48910: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096123.48913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096123.48916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.48919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096123.48921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.48975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096123.48979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.48994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.49021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.51011: stdout chunk (state=3): >>>ansible-tmp-1727096123.484235-12103-137794156391049=/root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049 <<< 11792 1727096123.51113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096123.51143: stderr chunk (state=3): >>><<< 11792 1727096123.51146: stdout chunk (state=3): >>><<< 11792 1727096123.51163: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096123.484235-12103-137794156391049=/root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096123.51196: variable 'ansible_module_compression' from source: unknown 11792 1727096123.51247: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11792 1727096123.51253: ANSIBALLZ: Acquiring lock 11792 1727096123.51256: ANSIBALLZ: Lock acquired: 139635227775856 11792 1727096123.51258: ANSIBALLZ: Creating module 11792 1727096123.62475: ANSIBALLZ: Writing module into payload 11792 1727096123.62569: ANSIBALLZ: Writing module 11792 1727096123.62597: ANSIBALLZ: Renaming module 11792 1727096123.62614: ANSIBALLZ: Done creating module 11792 1727096123.62639: variable 'ansible_facts' from source: unknown 11792 1727096123.62743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/AnsiballZ_dnf.py 11792 1727096123.62865: Sending initial data 11792 1727096123.62871: Sent initial data (151 bytes) 11792 1727096123.63338: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096123.63341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096123.63345: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096123.63347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.63400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096123.63404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.63408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.63453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.65327: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096123.65375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096123.65379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp87hzmf1o" to remote "/root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/AnsiballZ_dnf.py" <<< 11792 1727096123.65382: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp87hzmf1o /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/AnsiballZ_dnf.py <<< 11792 1727096123.66445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096123.66452: stdout chunk (state=3): >>><<< 11792 1727096123.66455: stderr chunk (state=3): >>><<< 11792 1727096123.66461: done transferring module to remote 11792 1727096123.66463: _low_level_execute_command(): starting 11792 1727096123.66465: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/ /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/AnsiballZ_dnf.py && sleep 0' 11792 1727096123.67118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096123.67184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096123.67247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096123.67270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.67292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.67366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096123.69487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096123.69491: stdout chunk (state=3): >>><<< 11792 1727096123.69494: stderr chunk (state=3): >>><<< 11792 1727096123.69633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096123.69637: _low_level_execute_command(): starting 11792 1727096123.69639: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/AnsiballZ_dnf.py && sleep 0' 11792 1727096123.70202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096123.70221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096123.70238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096123.70258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096123.70277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096123.70288: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096123.70385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096123.70407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096123.70424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096123.70500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096125.50076: stdout chunk (state=3): >>> <<< 11792 1727096125.50082: stdout chunk (state=3): >>>{"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 
11792 1727096125.56113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096125.56138: stderr chunk (state=3): >>><<< 11792 1727096125.56141: stdout chunk (state=3): >>><<< 11792 1727096125.56161: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
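The _low_level_execute_command above ran AnsiballZ_dnf.py on the target and received a single JSON document on stdout; the executor parses that document to build the task result. A minimal sketch of that step in plain Python, using an abridged copy of the exact keys visible in the logged output (this is illustrative parsing, not Ansible's own code):

    import json

    # Abridged stdout from the AnsiballZ_dnf.py run shown above; only the keys
    # relevant to the task outcome are kept here.
    raw_stdout = '{"msg": "", "changed": true, "rc": 0, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"]}'

    result = json.loads(raw_stdout)
    # The "changed: [managed_node2] => {...}" summary later in this trace is
    # built from exactly these fields.
    assert result["rc"] == 0 and result["changed"]
    print(result["results"][0])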
11792 1727096125.56200: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096125.56206: _low_level_execute_command(): starting 11792 1727096125.56211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096123.484235-12103-137794156391049/ > /dev/null 2>&1 && sleep 0' 11792 1727096125.56679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096125.56682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096125.56684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096125.56686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096125.56745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096125.56752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096125.56754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096125.56788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096125.58740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096125.58770: stderr chunk (state=3): >>><<< 11792 1727096125.58774: stdout chunk (state=3): >>><<< 11792 1727096125.58789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096125.58798: handler run complete 11792 1727096125.58914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096125.59040: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096125.59071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096125.59093: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096125.59118: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096125.59171: variable '__install_status' from source: unknown 11792 1727096125.59185: Evaluated conditional (__install_status is success): True 11792 1727096125.59197: attempt loop complete, returning result 11792 1727096125.59200: _execute() done 11792 1727096125.59203: dumping result to json 11792 1727096125.59209: done dumping result, returning 11792 1727096125.59217: done running TaskExecutor() for managed_node2/TASK: Install dnsmasq [0afff68d-5257-d9c7-3fc0-000000000112] 11792 1727096125.59220: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000112 11792 1727096125.59360: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000112 11792 1727096125.59362: WORKER PROCESS EXITING changed: [managed_node2] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-3.el10.x86_64" ] } 11792 1727096125.59440: no more pending results, returning what we have 11792 1727096125.59444: results queue empty 11792 1727096125.59445: checking for any_errors_fatal 11792 1727096125.59446: done checking for any_errors_fatal 11792 1727096125.59447: checking for max_fail_percentage 11792 1727096125.59448: done checking for max_fail_percentage 11792 1727096125.59449: checking to see if all hosts have failed and the running result is not ok 11792 1727096125.59452: done checking to see if all hosts have failed 11792 1727096125.59452: getting the remaining hosts for this loop 11792 1727096125.59454: done getting the remaining hosts for this loop 11792 1727096125.59457: getting the next task for host managed_node2 11792 1727096125.59462: done getting next task for host managed_node2 11792 1727096125.59464: ^ task is: TASK: Install pgrep, sysctl 11792 1727096125.59475: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096125.59479: getting variables 11792 1727096125.59480: in VariableManager get_vars() 11792 1727096125.59505: Calling all_inventory to load vars for managed_node2 11792 1727096125.59508: Calling groups_inventory to load vars for managed_node2 11792 1727096125.59511: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096125.59520: Calling all_plugins_play to load vars for managed_node2 11792 1727096125.59523: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096125.59525: Calling groups_plugins_play to load vars for managed_node2 11792 1727096125.59673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096125.59791: done with get_vars() 11792 1727096125.59805: done getting variables 11792 1727096125.59847: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Monday 23 September 2024 08:55:25 -0400 (0:00:02.175) 0:00:07.877 ****** 11792 1727096125.59873: entering _queue_task() for managed_node2/package 11792 1727096125.60087: worker is 1 (out of 1 available) 11792 1727096125.60099: exiting _queue_task() for managed_node2/package 11792 1727096125.60111: done queuing things up, now waiting for results queue to drain 11792 1727096125.60113: waiting for pending results... 
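The task queued above (create_test_interfaces_with_dhcp.yml:17) is about to be skipped: its conditional, ansible_distribution_major_version is version('6', '<='), evaluates to False on this host. A rough Python analogue of that comparison is sketched below; the real check uses Ansible's 'version' test, and the fact value of 10 is an inference from the el10 package tag seen earlier in this trace.

    # Rough analogue of: ansible_distribution_major_version is version('6', '<=')
    def el6_or_older(major_version: str) -> bool:
        return int(major_version) <= 6

    print(el6_or_older("10"))   # False -> the task at yml:17 is skipped, as shown below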
11792 1727096125.60266: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11792 1727096125.60328: in run() - task 0afff68d-5257-d9c7-3fc0-000000000113 11792 1727096125.60343: variable 'ansible_search_path' from source: unknown 11792 1727096125.60347: variable 'ansible_search_path' from source: unknown 11792 1727096125.60375: calling self._execute() 11792 1727096125.60439: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096125.60477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096125.60480: variable 'omit' from source: magic vars 11792 1727096125.60828: variable 'ansible_distribution_major_version' from source: facts 11792 1727096125.60849: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096125.61173: variable 'ansible_os_family' from source: facts 11792 1727096125.61176: Evaluated conditional (ansible_os_family == 'RedHat'): True 11792 1727096125.61218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096125.61519: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096125.61563: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096125.61601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096125.61636: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096125.61717: variable 'ansible_distribution_major_version' from source: facts 11792 1727096125.61733: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11792 1727096125.61740: when evaluation is False, skipping this task 11792 1727096125.61747: _execute() done 11792 1727096125.61754: dumping result to json 11792 1727096125.61761: done dumping result, returning 11792 1727096125.61774: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0afff68d-5257-d9c7-3fc0-000000000113] 11792 1727096125.61782: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000113 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11792 1727096125.61958: no more pending results, returning what we have 11792 1727096125.61962: results queue empty 11792 1727096125.61962: checking for any_errors_fatal 11792 1727096125.61973: done checking for any_errors_fatal 11792 1727096125.61974: checking for max_fail_percentage 11792 1727096125.61976: done checking for max_fail_percentage 11792 1727096125.61977: checking to see if all hosts have failed and the running result is not ok 11792 1727096125.61977: done checking to see if all hosts have failed 11792 1727096125.61978: getting the remaining hosts for this loop 11792 1727096125.61979: done getting the remaining hosts for this loop 11792 1727096125.61983: getting the next task for host managed_node2 11792 1727096125.61990: done getting next task for host managed_node2 11792 1727096125.61992: ^ task is: TASK: Install pgrep, sysctl 11792 1727096125.61995: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096125.62000: getting variables 11792 1727096125.62001: in VariableManager get_vars() 11792 1727096125.62034: Calling all_inventory to load vars for managed_node2 11792 1727096125.62036: Calling groups_inventory to load vars for managed_node2 11792 1727096125.62040: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096125.62053: Calling all_plugins_play to load vars for managed_node2 11792 1727096125.62056: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096125.62058: Calling groups_plugins_play to load vars for managed_node2 11792 1727096125.62249: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000113 11792 1727096125.62252: WORKER PROCESS EXITING 11792 1727096125.62277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096125.62501: done with get_vars() 11792 1727096125.62511: done getting variables 11792 1727096125.62565: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Monday 23 September 2024 08:55:25 -0400 (0:00:00.027) 0:00:07.905 ****** 11792 1727096125.62598: entering _queue_task() for managed_node2/package 11792 1727096125.62866: worker is 1 (out of 1 available) 11792 1727096125.62882: exiting _queue_task() for managed_node2/package 11792 1727096125.62894: done queuing things up, now waiting for results queue to drain 11792 1727096125.62896: waiting for pending results... 
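Before running the dnf module again for the task queued above, the worker repeats the same low-level command sequence used for the dnsmasq task: `echo ~` to resolve the remote home directory, then a guarded mkdir that creates a per-invocation temp directory whose name embeds a timestamp, a PID-like number and a random suffix. A hedged local sketch of how such a command string can be assembled follows; it only approximates the pattern visible in this trace and is not Ansible's shell-plugin implementation.

    import random
    import time

    # Illustrative only: approximates the remote tmpdir command pattern seen in
    # this trace; the numeric middle component (12211 here) is taken from the log.
    base = "/root/.ansible/tmp"
    name = "ansible-tmp-%s-%s-%s" % (time.time(), 12211, random.randint(0, 2**48))
    path = "%s/%s" % (base, name)
    cmd = '( umask 77 && mkdir -p "%s" && mkdir "%s" && echo %s="%s" ) && sleep 0' % (base, path, name, path)
    print(cmd)   # executed remotely as: /bin/sh -c '<cmd>' over the multiplexed SSH connection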
11792 1727096125.63141: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11792 1727096125.63251: in run() - task 0afff68d-5257-d9c7-3fc0-000000000114 11792 1727096125.63277: variable 'ansible_search_path' from source: unknown 11792 1727096125.63285: variable 'ansible_search_path' from source: unknown 11792 1727096125.63322: calling self._execute() 11792 1727096125.63402: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096125.63414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096125.63427: variable 'omit' from source: magic vars 11792 1727096125.63772: variable 'ansible_distribution_major_version' from source: facts 11792 1727096125.63788: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096125.63909: variable 'ansible_os_family' from source: facts 11792 1727096125.63920: Evaluated conditional (ansible_os_family == 'RedHat'): True 11792 1727096125.64090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096125.64362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096125.64409: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096125.64448: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096125.64566: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096125.64571: variable 'ansible_distribution_major_version' from source: facts 11792 1727096125.64589: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11792 1727096125.64598: variable 'omit' from source: magic vars 11792 1727096125.64646: variable 'omit' from source: magic vars 11792 1727096125.64808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096125.67154: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096125.67224: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096125.67263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096125.67305: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096125.67339: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096125.67438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096125.67473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096125.67606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096125.67610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096125.67612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096125.67666: variable '__network_is_ostree' from source: set_fact 11792 1727096125.67679: variable 'omit' from source: magic vars 11792 1727096125.67715: variable 'omit' from source: magic vars 11792 1727096125.67746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096125.67779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096125.67801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096125.67826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096125.67840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096125.67876: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096125.67886: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096125.67895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096125.68005: Set connection var ansible_timeout to 10 11792 1727096125.68022: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096125.68043: Set connection var ansible_shell_executable to /bin/sh 11792 1727096125.68055: Set connection var ansible_pipelining to False 11792 1727096125.68063: Set connection var ansible_shell_type to sh 11792 1727096125.68072: Set connection var ansible_connection to ssh 11792 1727096125.68121: variable 'ansible_shell_executable' from source: unknown 11792 1727096125.68149: variable 'ansible_connection' from source: unknown 11792 1727096125.68152: variable 'ansible_module_compression' from source: unknown 11792 1727096125.68154: variable 'ansible_shell_type' from source: unknown 11792 1727096125.68157: variable 'ansible_shell_executable' from source: unknown 11792 1727096125.68159: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096125.68160: variable 'ansible_pipelining' from source: unknown 11792 1727096125.68258: variable 'ansible_timeout' from source: unknown 11792 1727096125.68261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096125.68281: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096125.68295: variable 'omit' from source: magic vars 11792 1727096125.68304: starting attempt loop 11792 1727096125.68311: running the handler 11792 1727096125.68320: variable 'ansible_facts' from source: unknown 11792 1727096125.68325: variable 'ansible_facts' from source: unknown 11792 1727096125.68383: _low_level_execute_command(): starting 11792 1727096125.68397: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096125.69094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 11792 1727096125.69111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096125.69131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096125.69237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096125.69266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096125.69285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096125.69360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096125.71066: stdout chunk (state=3): >>>/root <<< 11792 1727096125.71202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096125.71231: stdout chunk (state=3): >>><<< 11792 1727096125.71235: stderr chunk (state=3): >>><<< 11792 1727096125.71253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096125.71273: _low_level_execute_command(): starting 11792 1727096125.71283: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928 `" && echo ansible-tmp-1727096125.7126005-12211-82745203947928="` echo /root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928 `" ) && sleep 0' 11792 1727096125.71919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096125.71934: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 11792 1727096125.71991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096125.72064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096125.72086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096125.72106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096125.72177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096125.74181: stdout chunk (state=3): >>>ansible-tmp-1727096125.7126005-12211-82745203947928=/root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928 <<< 11792 1727096125.74339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096125.74343: stdout chunk (state=3): >>><<< 11792 1727096125.74346: stderr chunk (state=3): >>><<< 11792 1727096125.74370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096125.7126005-12211-82745203947928=/root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096125.74408: variable 'ansible_module_compression' from source: unknown 11792 1727096125.74553: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11792 1727096125.74557: variable 'ansible_facts' from source: unknown 11792 1727096125.74687: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/AnsiballZ_dnf.py 11792 1727096125.74946: Sending initial data 11792 1727096125.74949: Sent initial data (151 bytes) 11792 1727096125.75589: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096125.75622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096125.75636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096125.75656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096125.75727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096125.77406: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096125.77470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096125.77505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpjys7yf87" to remote "/root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/AnsiballZ_dnf.py" <<< 11792 1727096125.77685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpjys7yf87 /root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/AnsiballZ_dnf.py <<< 11792 1727096125.78446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096125.78527: stderr chunk (state=3): >>><<< 11792 1727096125.78539: stdout chunk (state=3): >>><<< 11792 1727096125.78573: done transferring module to remote 11792 1727096125.78594: _low_level_execute_command(): starting 11792 1727096125.78603: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/ /root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/AnsiballZ_dnf.py && sleep 0' 11792 1727096125.79245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096125.79263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096125.79278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096125.79330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096125.79397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096125.79414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096125.79441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096125.79505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096125.81428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096125.81441: stdout chunk (state=3): >>><<< 11792 1727096125.81457: stderr chunk (state=3): >>><<< 11792 1727096125.81568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096125.81573: _low_level_execute_command(): starting 11792 1727096125.81576: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/AnsiballZ_dnf.py && sleep 0' 11792 1727096125.82154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096125.82171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096125.82183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096125.82289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096126.25620: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11792 1727096126.29912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096126.29940: stderr chunk (state=3): >>><<< 11792 1727096126.29943: stdout chunk (state=3): >>><<< 11792 1727096126.29962: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
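[editor's note] The chunks above show one complete module round trip: AnsiballZ_dnf.py is copied over SFTP into a per-task temp directory, made executable, run with /usr/bin/python3.12, and the dnf module reports "Nothing to do" because procps-ng is already present; every hop reuses the existing SSH ControlMaster socket. A minimal sketch of an equivalent ad-hoc call and of probing that multiplexed connection follows; the module name, package, target host, IP and ControlPath are taken from the trace, but the command lines themselves are illustrative (they assume the same inventory is available) and are not the play's own invocation.

    # Illustrative ad-hoc equivalent of the recorded dnf invocation
    # (module_args: name=procps-ng, state=present, host managed_node2):
    ansible managed_node2 -m ansible.legacy.dnf -a 'name=procps-ng state=present'

    # The ControlMaster socket the trace keeps reusing ("auto-mux: Trying
    # existing master at '/root/.ansible/cp/35282dee7b'") can be checked directly:
    ssh -O check -o ControlPath=/root/.ansible/cp/35282dee7b root@10.31.15.126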
11792 1727096126.29997: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096126.30003: _low_level_execute_command(): starting 11792 1727096126.30008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096125.7126005-12211-82745203947928/ > /dev/null 2>&1 && sleep 0' 11792 1727096126.30476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096126.30481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.30484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096126.30487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096126.30489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096126.30491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.30538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096126.30541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096126.30548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096126.30584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096126.32451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096126.32478: stderr chunk (state=3): >>><<< 11792 1727096126.32481: stdout chunk (state=3): >>><<< 11792 1727096126.32495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096126.32502: handler run complete 11792 1727096126.32529: attempt loop complete, returning result 11792 1727096126.32532: _execute() done 11792 1727096126.32534: dumping result to json 11792 1727096126.32538: done dumping result, returning 11792 1727096126.32546: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0afff68d-5257-d9c7-3fc0-000000000114] 11792 1727096126.32549: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000114 11792 1727096126.32645: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000114 11792 1727096126.32649: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11792 1727096126.32720: no more pending results, returning what we have 11792 1727096126.32724: results queue empty 11792 1727096126.32724: checking for any_errors_fatal 11792 1727096126.32729: done checking for any_errors_fatal 11792 1727096126.32730: checking for max_fail_percentage 11792 1727096126.32731: done checking for max_fail_percentage 11792 1727096126.32732: checking to see if all hosts have failed and the running result is not ok 11792 1727096126.32733: done checking to see if all hosts have failed 11792 1727096126.32733: getting the remaining hosts for this loop 11792 1727096126.32735: done getting the remaining hosts for this loop 11792 1727096126.32738: getting the next task for host managed_node2 11792 1727096126.32744: done getting next task for host managed_node2 11792 1727096126.32746: ^ task is: TASK: Create test interfaces 11792 1727096126.32749: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096126.32753: getting variables 11792 1727096126.32754: in VariableManager get_vars() 11792 1727096126.32791: Calling all_inventory to load vars for managed_node2 11792 1727096126.32794: Calling groups_inventory to load vars for managed_node2 11792 1727096126.32797: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096126.32808: Calling all_plugins_play to load vars for managed_node2 11792 1727096126.32811: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096126.32813: Calling groups_plugins_play to load vars for managed_node2 11792 1727096126.32954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096126.33073: done with get_vars() 11792 1727096126.33081: done getting variables 11792 1727096126.33146: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Monday 23 September 2024 08:55:26 -0400 (0:00:00.705) 0:00:08.610 ****** 11792 1727096126.33170: entering _queue_task() for managed_node2/shell 11792 1727096126.33175: Creating lock for shell 11792 1727096126.33373: worker is 1 (out of 1 available) 11792 1727096126.33387: exiting _queue_task() for managed_node2/shell 11792 1727096126.33399: done queuing things up, now waiting for results queue to drain 11792 1727096126.33401: waiting for pending results... 
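[editor's note] The PID-and-timestamp prefixed lines and the raw ssh stderr/stdout chunks throughout this trace come from running with Ansible's debug output enabled; the module arguments recorded earlier ('_ansible_debug': True, '_ansible_verbosity': 2) suggest a run roughly like the sketch below. The playbook path is a placeholder, and the ANSIBLE_DEBUG / -vv flags are inferred from those recorded arguments rather than quoted from this log.

    # Hypothetical reproduction of this style of trace:
    # ANSIBLE_DEBUG enables the per-worker debug lines, -vv matches the
    # recorded verbosity of 2; <playbook.yml> stands in for the actual play.
    ANSIBLE_DEBUG=1 ansible-playbook -vv <playbook.yml>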
11792 1727096126.33551: running TaskExecutor() for managed_node2/TASK: Create test interfaces 11792 1727096126.33615: in run() - task 0afff68d-5257-d9c7-3fc0-000000000115 11792 1727096126.33626: variable 'ansible_search_path' from source: unknown 11792 1727096126.33631: variable 'ansible_search_path' from source: unknown 11792 1727096126.33660: calling self._execute() 11792 1727096126.33715: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096126.33720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096126.33730: variable 'omit' from source: magic vars 11792 1727096126.33992: variable 'ansible_distribution_major_version' from source: facts 11792 1727096126.34001: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096126.34007: variable 'omit' from source: magic vars 11792 1727096126.34034: variable 'omit' from source: magic vars 11792 1727096126.34322: variable 'dhcp_interface1' from source: play vars 11792 1727096126.34326: variable 'dhcp_interface2' from source: play vars 11792 1727096126.34342: variable 'omit' from source: magic vars 11792 1727096126.34376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096126.34404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096126.34420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096126.34433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096126.34441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096126.34466: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096126.34471: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096126.34473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096126.34542: Set connection var ansible_timeout to 10 11792 1727096126.34548: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096126.34559: Set connection var ansible_shell_executable to /bin/sh 11792 1727096126.34563: Set connection var ansible_pipelining to False 11792 1727096126.34566: Set connection var ansible_shell_type to sh 11792 1727096126.34570: Set connection var ansible_connection to ssh 11792 1727096126.34586: variable 'ansible_shell_executable' from source: unknown 11792 1727096126.34589: variable 'ansible_connection' from source: unknown 11792 1727096126.34591: variable 'ansible_module_compression' from source: unknown 11792 1727096126.34594: variable 'ansible_shell_type' from source: unknown 11792 1727096126.34596: variable 'ansible_shell_executable' from source: unknown 11792 1727096126.34598: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096126.34604: variable 'ansible_pipelining' from source: unknown 11792 1727096126.34607: variable 'ansible_timeout' from source: unknown 11792 1727096126.34611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096126.34709: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096126.34716: variable 'omit' from source: magic vars 11792 1727096126.34723: starting attempt loop 11792 1727096126.34726: running the handler 11792 1727096126.34734: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096126.34748: _low_level_execute_command(): starting 11792 1727096126.34758: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096126.35262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096126.35266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.35272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096126.35274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.35323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096126.35326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096126.35328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096126.35374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096126.37040: stdout chunk (state=3): >>>/root <<< 11792 1727096126.37137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096126.37166: stderr chunk (state=3): >>><<< 11792 1727096126.37172: stdout chunk (state=3): >>><<< 11792 1727096126.37193: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096126.37205: _low_level_execute_command(): starting 11792 1727096126.37210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414 `" && echo ansible-tmp-1727096126.3719301-12237-13665618560414="` echo /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414 `" ) && sleep 0' 11792 1727096126.37660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096126.37674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096126.37676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.37679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096126.37681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.37723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096126.37726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096126.37728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096126.37765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096126.39693: stdout chunk (state=3): >>>ansible-tmp-1727096126.3719301-12237-13665618560414=/root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414 <<< 11792 1727096126.39798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096126.39828: stderr chunk (state=3): >>><<< 11792 1727096126.39831: stdout chunk (state=3): >>><<< 11792 1727096126.39849: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096126.3719301-12237-13665618560414=/root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096126.39878: variable 'ansible_module_compression' from source: unknown 11792 1727096126.39922: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096126.39953: variable 'ansible_facts' from source: unknown 11792 1727096126.39999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/AnsiballZ_command.py 11792 1727096126.40100: Sending initial data 11792 1727096126.40104: Sent initial data (155 bytes) 11792 1727096126.40545: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096126.40549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096126.40554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.40556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096126.40558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.40608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096126.40611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096126.40649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096126.42240: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11792 1727096126.42246: stderr chunk (state=3): 
>>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096126.42262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096126.42293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpb4q34_pj /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/AnsiballZ_command.py <<< 11792 1727096126.42304: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/AnsiballZ_command.py" <<< 11792 1727096126.42324: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpb4q34_pj" to remote "/root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/AnsiballZ_command.py" <<< 11792 1727096126.42328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/AnsiballZ_command.py" <<< 11792 1727096126.42815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096126.42858: stderr chunk (state=3): >>><<< 11792 1727096126.42861: stdout chunk (state=3): >>><<< 11792 1727096126.42910: done transferring module to remote 11792 1727096126.42919: _low_level_execute_command(): starting 11792 1727096126.42923: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/ /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/AnsiballZ_command.py && sleep 0' 11792 1727096126.43369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096126.43373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096126.43379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096126.43382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.43424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096126.43428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096126.43466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096126.45229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096126.45257: stderr chunk (state=3): >>><<< 11792 1727096126.45260: stdout chunk (state=3): >>><<< 11792 1727096126.45273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096126.45276: _low_level_execute_command(): starting 11792 1727096126.45281: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/AnsiballZ_command.py && sleep 0' 11792 1727096126.45707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096126.45711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.45721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096126.45774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096126.45789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096126.45824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096127.84457: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 
']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.co<<< 11792 1727096127.84464: stdout chunk (state=3): >>>m/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:55:26.611486", "end": "2024-09-23 08:55:27.842531", "delta": "0:00:01.231045", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096127.86177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096127.86181: stdout chunk (state=3): >>><<< 11792 1727096127.86184: stderr chunk (state=3): >>><<< 11792 1727096127.86348: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:55:26.611486", "end": "2024-09-23 08:55:27.842531", "delta": "0:00:01.231045", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
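[editor's note] Unescaped for readability, the core of the script that the chunk above reports running on managed_node2 appears below. This is a trimmed excerpt of the _raw_params recorded in the result JSON: the NetworkManager pgrep guards, the retry loop around the address assignment, and the RHEL 6 branch are omitted, since the '+ ...' xtrace in stderr shows the non-RHEL 6 path was taken and firewalld was inactive.

    # Excerpt of the transferred "Create test interfaces" script
    # (reconstructed from the escaped _raw_params in the result above):
    set -euxo pipefail
    exec 1>&2
    ip link add test1 type veth peer name test1p
    ip link add test2 type veth peer name test2p
    # NetworkManager should not manage the DHCP-server side of the veths
    nmcli d set test1 managed true
    nmcli d set test2 managed true
    nmcli d set test1p managed false
    nmcli d set test2p managed false
    ip link set test1p up
    ip link set test2p up
    ip link add name testbr type bridge forward_delay 0
    nmcli d set testbr managed false
    ip link set testbr up
    ip addr add 192.0.2.1/24 dev testbr
    ip -6 addr add 2001:DB8::1/32 dev testbr
    ip link set test1p master testbr
    ip link set test2p master testbr
    # Joint DHCP4/DHCP6 server with router advertisements on the test bridge
    dnsmasq --pid-file=/run/dhcp_testbr.pid \
            --dhcp-leasefile=/run/dhcp_testbr.lease \
            --dhcp-range=192.0.2.1,192.0.2.254,240 \
            --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
            --enable-ra --interface=testbr --bind-interfaces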
11792 1727096127.86357: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096127.86360: _low_level_execute_command(): starting 11792 1727096127.86363: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096126.3719301-12237-13665618560414/ > /dev/null 2>&1 && sleep 0' 11792 1727096127.86975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096127.86989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096127.87007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096127.87040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096127.87058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096127.87145: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096127.87177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096127.87200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096127.87218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096127.87299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096127.89283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096127.89286: stdout chunk (state=3): >>><<< 11792 1727096127.89289: stderr chunk (state=3): >>><<< 11792 1727096127.89306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096127.89318: handler run complete 11792 1727096127.89352: Evaluated conditional (False): False 11792 1727096127.89479: attempt loop complete, returning result 11792 1727096127.89482: _execute() done 11792 1727096127.89484: dumping result to json 11792 1727096127.89486: done dumping result, returning 11792 1727096127.89489: done running TaskExecutor() for managed_node2/TASK: Create test interfaces [0afff68d-5257-d9c7-3fc0-000000000115] 11792 1727096127.89491: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000115 11792 1727096127.89572: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000115 11792 1727096127.89575: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.231045", "end": "2024-09-23 08:55:27.842531", "rc": 0, "start": "2024-09-23 08:55:26.611486" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6933 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6933 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11792 1727096127.89780: no more pending results, returning what we have 11792 1727096127.89784: results queue empty 11792 1727096127.89784: checking for any_errors_fatal 11792 1727096127.89792: done checking for any_errors_fatal 11792 1727096127.89793: checking for max_fail_percentage 11792 1727096127.89800: done checking for max_fail_percentage 11792 1727096127.89801: checking to see if all hosts have failed and the running result is not ok 11792 1727096127.89802: done checking to see if all hosts have failed 11792 1727096127.89802: getting the remaining hosts for this loop 11792 1727096127.89805: done getting the remaining hosts for this loop 11792 1727096127.89809: getting the next task for host managed_node2 11792 1727096127.89819: done getting next task for host managed_node2 11792 1727096127.89822: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 11792 1727096127.89828: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096127.89832: getting variables 11792 1727096127.89833: in VariableManager get_vars() 11792 1727096127.89974: Calling all_inventory to load vars for managed_node2 11792 1727096127.89977: Calling groups_inventory to load vars for managed_node2 11792 1727096127.89982: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096127.89994: Calling all_plugins_play to load vars for managed_node2 11792 1727096127.89997: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096127.90000: Calling groups_plugins_play to load vars for managed_node2 11792 1727096127.90362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096127.90562: done with get_vars() 11792 1727096127.90579: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:27 -0400 (0:00:01.575) 0:00:10.186 ****** 11792 1727096127.90689: entering _queue_task() for managed_node2/include_tasks 11792 1727096127.91043: worker is 1 (out of 1 available) 11792 1727096127.91057: exiting _queue_task() for managed_node2/include_tasks 11792 1727096127.91171: done queuing things up, now waiting for results queue to drain 11792 1727096127.91173: waiting for pending results... 
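The "Create test interfaces" result above records the full setup script verbatim. As a condensed illustration of the branch actually taken on this host (non-RHEL6, per the STDERR trace), the core of that task could be written as a shell task along the following lines; the task structure is an assumption for illustration only, the commands are lifted from the logged script, and the retry loop, the NetworkManager handling of testbr, the firewalld service opening, and the RHEL6 branch are omitted.

- name: Create test interfaces (condensed sketch, non-RHEL6 path only)
  ansible.builtin.shell: |
    set -euxo pipefail
    # veth pairs test1/test1p and test2/test2p; the peer ends stay unmanaged
    ip link add test1 type veth peer name test1p
    ip link add test2 type veth peer name test2p
    if [ -n "$(pgrep NetworkManager)" ]; then
      nmcli d set test1p managed false
      nmcli d set test2p managed false
    fi
    ip link set test1p up
    ip link set test2p up
    # bridge that carries the test DHCP server
    ip link add name testbr type bridge forward_delay 0
    ip link set testbr up
    ip addr add 192.0.2.1/24 dev testbr
    ip -6 addr add 2001:DB8::1/32 dev testbr
    ip link set test1p master testbr
    ip link set test2p master testbr
    # joint DHCPv4/DHCPv6 server with router advertisements enabled
    dnsmasq --pid-file=/run/dhcp_testbr.pid \
      --dhcp-leasefile=/run/dhcp_testbr.lease \
      --dhcp-range=192.0.2.1,192.0.2.254,240 \
      --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
      --enable-ra --interface=testbr --bind-interfaces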
11792 1727096127.91433: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11792 1727096127.91648: in run() - task 0afff68d-5257-d9c7-3fc0-00000000011c 11792 1727096127.91872: variable 'ansible_search_path' from source: unknown 11792 1727096127.91876: variable 'ansible_search_path' from source: unknown 11792 1727096127.91879: calling self._execute() 11792 1727096127.91881: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096127.91884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096127.91886: variable 'omit' from source: magic vars 11792 1727096127.92135: variable 'ansible_distribution_major_version' from source: facts 11792 1727096127.92151: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096127.92162: _execute() done 11792 1727096127.92171: dumping result to json 11792 1727096127.92178: done dumping result, returning 11792 1727096127.92187: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-d9c7-3fc0-00000000011c] 11792 1727096127.92194: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000011c 11792 1727096127.92377: no more pending results, returning what we have 11792 1727096127.92382: in VariableManager get_vars() 11792 1727096127.92414: Calling all_inventory to load vars for managed_node2 11792 1727096127.92416: Calling groups_inventory to load vars for managed_node2 11792 1727096127.92529: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096127.92537: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000011c 11792 1727096127.92540: WORKER PROCESS EXITING 11792 1727096127.92549: Calling all_plugins_play to load vars for managed_node2 11792 1727096127.92551: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096127.92554: Calling groups_plugins_play to load vars for managed_node2 11792 1727096127.92725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096127.92914: done with get_vars() 11792 1727096127.92921: variable 'ansible_search_path' from source: unknown 11792 1727096127.92922: variable 'ansible_search_path' from source: unknown 11792 1727096127.92955: we have included files to process 11792 1727096127.92956: generating all_blocks data 11792 1727096127.92958: done generating all_blocks data 11792 1727096127.92963: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096127.92964: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096127.92966: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096127.93189: done processing included file 11792 1727096127.93191: iterating over new_blocks loaded from include file 11792 1727096127.93193: in VariableManager get_vars() 11792 1727096127.93207: done with get_vars() 11792 1727096127.93209: filtering new block on tags 11792 1727096127.93241: done filtering new block on tags 11792 1727096127.93243: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11792 
1727096127.93248: extending task lists for all hosts with included blocks 11792 1727096127.93475: done extending task lists 11792 1727096127.93476: done processing included files 11792 1727096127.93477: results queue empty 11792 1727096127.93478: checking for any_errors_fatal 11792 1727096127.93481: done checking for any_errors_fatal 11792 1727096127.93482: checking for max_fail_percentage 11792 1727096127.93483: done checking for max_fail_percentage 11792 1727096127.93484: checking to see if all hosts have failed and the running result is not ok 11792 1727096127.93485: done checking to see if all hosts have failed 11792 1727096127.93485: getting the remaining hosts for this loop 11792 1727096127.93487: done getting the remaining hosts for this loop 11792 1727096127.93489: getting the next task for host managed_node2 11792 1727096127.93494: done getting next task for host managed_node2 11792 1727096127.93496: ^ task is: TASK: Get stat for interface {{ interface }} 11792 1727096127.93500: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096127.93502: getting variables 11792 1727096127.93503: in VariableManager get_vars() 11792 1727096127.93510: Calling all_inventory to load vars for managed_node2 11792 1727096127.93512: Calling groups_inventory to load vars for managed_node2 11792 1727096127.93514: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096127.93519: Calling all_plugins_play to load vars for managed_node2 11792 1727096127.93521: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096127.93524: Calling groups_plugins_play to load vars for managed_node2 11792 1727096127.93655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096127.93846: done with get_vars() 11792 1727096127.93855: done getting variables 11792 1727096127.94000: variable 'interface' from source: task vars 11792 1727096127.94004: variable 'dhcp_interface1' from source: play vars 11792 1727096127.94064: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:27 -0400 (0:00:00.034) 0:00:10.220 ****** 11792 1727096127.94098: entering _queue_task() for managed_node2/stat 11792 1727096127.94327: worker is 1 (out of 1 available) 11792 1727096127.94339: exiting _queue_task() for managed_node2/stat 11792 1727096127.94351: done queuing things up, now waiting for results queue to drain 11792 1727096127.94352: waiting for pending results... 11792 1727096127.94601: running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 11792 1727096127.94723: in run() - task 0afff68d-5257-d9c7-3fc0-00000000017b 11792 1727096127.94742: variable 'ansible_search_path' from source: unknown 11792 1727096127.94748: variable 'ansible_search_path' from source: unknown 11792 1727096127.94787: calling self._execute() 11792 1727096127.94857: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096127.94870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096127.94883: variable 'omit' from source: magic vars 11792 1727096127.95253: variable 'ansible_distribution_major_version' from source: facts 11792 1727096127.95273: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096127.95372: variable 'omit' from source: magic vars 11792 1727096127.95375: variable 'omit' from source: magic vars 11792 1727096127.95453: variable 'interface' from source: task vars 11792 1727096127.95463: variable 'dhcp_interface1' from source: play vars 11792 1727096127.95533: variable 'dhcp_interface1' from source: play vars 11792 1727096127.95559: variable 'omit' from source: magic vars 11792 1727096127.95608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096127.95650: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096127.95678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096127.95701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096127.95723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11792 1727096127.95758: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096127.95820: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096127.95824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096127.95887: Set connection var ansible_timeout to 10 11792 1727096127.95905: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096127.95920: Set connection var ansible_shell_executable to /bin/sh 11792 1727096127.95938: Set connection var ansible_pipelining to False 11792 1727096127.95945: Set connection var ansible_shell_type to sh 11792 1727096127.95953: Set connection var ansible_connection to ssh 11792 1727096127.95981: variable 'ansible_shell_executable' from source: unknown 11792 1727096127.96040: variable 'ansible_connection' from source: unknown 11792 1727096127.96043: variable 'ansible_module_compression' from source: unknown 11792 1727096127.96045: variable 'ansible_shell_type' from source: unknown 11792 1727096127.96047: variable 'ansible_shell_executable' from source: unknown 11792 1727096127.96049: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096127.96051: variable 'ansible_pipelining' from source: unknown 11792 1727096127.96054: variable 'ansible_timeout' from source: unknown 11792 1727096127.96056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096127.96228: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096127.96245: variable 'omit' from source: magic vars 11792 1727096127.96261: starting attempt loop 11792 1727096127.96270: running the handler 11792 1727096127.96289: _low_level_execute_command(): starting 11792 1727096127.96368: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096127.97090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096127.97153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096127.97181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096127.97253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096127.98950: stdout chunk (state=3): >>>/root <<< 11792 1727096127.99089: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11792 1727096127.99117: stdout chunk (state=3): >>><<< 11792 1727096127.99144: stderr chunk (state=3): >>><<< 11792 1727096127.99164: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096127.99279: _low_level_execute_command(): starting 11792 1727096127.99284: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580 `" && echo ansible-tmp-1727096127.9917448-12274-262823623081580="` echo /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580 `" ) && sleep 0' 11792 1727096127.99916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096127.99945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096127.99963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096128.00052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096128.00132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.00178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.00181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.00247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.02211: stdout chunk (state=3): 
>>>ansible-tmp-1727096127.9917448-12274-262823623081580=/root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580 <<< 11792 1727096128.02381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.02385: stdout chunk (state=3): >>><<< 11792 1727096128.02387: stderr chunk (state=3): >>><<< 11792 1727096128.02573: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096127.9917448-12274-262823623081580=/root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096128.02576: variable 'ansible_module_compression' from source: unknown 11792 1727096128.02579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096128.02581: variable 'ansible_facts' from source: unknown 11792 1727096128.02663: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/AnsiballZ_stat.py 11792 1727096128.02823: Sending initial data 11792 1727096128.02832: Sent initial data (153 bytes) 11792 1727096128.03462: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096128.03477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096128.03491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096128.03506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096128.03595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.03612: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.03626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.03692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.05348: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096128.05393: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096128.05427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096128.05489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpnl5f16b8 /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/AnsiballZ_stat.py <<< 11792 1727096128.05499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/AnsiballZ_stat.py" <<< 11792 1727096128.05535: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpnl5f16b8" to remote "/root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/AnsiballZ_stat.py" <<< 11792 1727096128.06276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.06280: stdout chunk (state=3): >>><<< 11792 1727096128.06282: stderr chunk (state=3): >>><<< 11792 1727096128.06343: done transferring module to remote 11792 1727096128.06363: _low_level_execute_command(): starting 11792 1727096128.06462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/ /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/AnsiballZ_stat.py && sleep 0' 11792 1727096128.07111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096128.07133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096128.07163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096128.07202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096128.07220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096128.07265: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096128.07450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.07720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.07730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.07763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.09645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.09710: stderr chunk (state=3): >>><<< 11792 1727096128.09724: stdout chunk (state=3): >>><<< 11792 1727096128.09748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096128.09760: _low_level_execute_command(): starting 11792 1727096128.09774: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/AnsiballZ_stat.py && sleep 0' 11792 1727096128.10461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096128.10582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.10673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.10680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.26510: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26376, "dev": 23, "nlink": 1, "atime": 1727096126.6182241, "mtime": 1727096126.6182241, "ctime": 1727096126.6182241, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096128.27980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096128.27984: stdout chunk (state=3): >>><<< 11792 1727096128.27986: stderr chunk (state=3): >>><<< 11792 1727096128.28005: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26376, "dev": 23, "nlink": 1, "atime": 1727096126.6182241, "mtime": 1727096126.6182241, "ctime": 1727096126.6182241, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096128.28062: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096128.28090: _low_level_execute_command(): starting 11792 1727096128.28110: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096127.9917448-12274-262823623081580/ > /dev/null 2>&1 && sleep 0' 11792 1727096128.29364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096128.29417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.29434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.29459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.29539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.31519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.31523: stdout chunk (state=3): >>><<< 11792 1727096128.31735: stderr chunk (state=3): >>><<< 11792 1727096128.31739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096128.31745: handler run complete 11792 1727096128.31748: attempt loop complete, returning result 11792 1727096128.31752: _execute() done 11792 1727096128.31754: dumping result to json 11792 1727096128.31756: done dumping result, returning 11792 1727096128.31758: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 [0afff68d-5257-d9c7-3fc0-00000000017b] 11792 1727096128.31760: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000017b ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096126.6182241, "block_size": 4096, "blocks": 0, "ctime": 1727096126.6182241, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26376, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727096126.6182241, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11792 1727096128.32470: no more pending results, returning what we have 11792 1727096128.32475: results queue empty 11792 1727096128.32476: checking for any_errors_fatal 11792 1727096128.32477: done checking for any_errors_fatal 11792 1727096128.32478: checking for max_fail_percentage 11792 1727096128.32480: done checking for max_fail_percentage 11792 1727096128.32481: checking to see if all hosts have failed and the running result is not ok 11792 1727096128.32482: done checking to see if all hosts have failed 11792 1727096128.32483: getting the remaining hosts for this loop 11792 1727096128.32485: done getting the remaining hosts for this loop 11792 1727096128.32489: getting the next task for host managed_node2 11792 1727096128.32499: done getting next task for host managed_node2 11792 1727096128.32501: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11792 1727096128.32506: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096128.32510: getting variables 11792 1727096128.32512: in VariableManager get_vars() 11792 1727096128.32545: Calling all_inventory to load vars for managed_node2 11792 1727096128.32548: Calling groups_inventory to load vars for managed_node2 11792 1727096128.32555: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096128.32744: Calling all_plugins_play to load vars for managed_node2 11792 1727096128.32748: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096128.32757: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000017b 11792 1727096128.32760: WORKER PROCESS EXITING 11792 1727096128.32763: Calling groups_plugins_play to load vars for managed_node2 11792 1727096128.33672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096128.34174: done with get_vars() 11792 1727096128.34187: done getting variables 11792 1727096128.34419: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11792 1727096128.34658: variable 'interface' from source: task vars 11792 1727096128.34661: variable 'dhcp_interface1' from source: play vars 11792 1727096128.34833: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:28 -0400 (0:00:00.407) 0:00:10.627 ****** 11792 1727096128.34871: entering _queue_task() for managed_node2/assert 11792 1727096128.34873: Creating lock for assert 11792 1727096128.35882: worker is 1 (out of 1 available) 11792 1727096128.35892: exiting _queue_task() for managed_node2/assert 11792 1727096128.35904: done queuing things up, now waiting for results queue to drain 11792 1727096128.35905: waiting for pending results... 
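The stat invocation above shows exactly which arguments the "Get stat for interface test1" task passed to the stat module. A plausible reconstruction of tasks/get_interface_stat.yml, inferred from the logged task name, the module_args, and the interface_stat variable consumed by the following assert, would look roughly like this; it is not the file's verbatim contents.

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    # arguments as seen in the logged invocation
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat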
11792 1727096128.36236: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' 11792 1727096128.36358: in run() - task 0afff68d-5257-d9c7-3fc0-00000000011d 11792 1727096128.36774: variable 'ansible_search_path' from source: unknown 11792 1727096128.36777: variable 'ansible_search_path' from source: unknown 11792 1727096128.36780: calling self._execute() 11792 1727096128.36782: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.36785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.36792: variable 'omit' from source: magic vars 11792 1727096128.37354: variable 'ansible_distribution_major_version' from source: facts 11792 1727096128.37376: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096128.37389: variable 'omit' from source: magic vars 11792 1727096128.37476: variable 'omit' from source: magic vars 11792 1727096128.37582: variable 'interface' from source: task vars 11792 1727096128.37591: variable 'dhcp_interface1' from source: play vars 11792 1727096128.37655: variable 'dhcp_interface1' from source: play vars 11792 1727096128.37689: variable 'omit' from source: magic vars 11792 1727096128.37733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096128.37780: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096128.37806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096128.37830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096128.37846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096128.37891: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096128.37904: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.37913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.38032: Set connection var ansible_timeout to 10 11792 1727096128.38048: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096128.38065: Set connection var ansible_shell_executable to /bin/sh 11792 1727096128.38078: Set connection var ansible_pipelining to False 11792 1727096128.38084: Set connection var ansible_shell_type to sh 11792 1727096128.38089: Set connection var ansible_connection to ssh 11792 1727096128.38121: variable 'ansible_shell_executable' from source: unknown 11792 1727096128.38129: variable 'ansible_connection' from source: unknown 11792 1727096128.38135: variable 'ansible_module_compression' from source: unknown 11792 1727096128.38141: variable 'ansible_shell_type' from source: unknown 11792 1727096128.38147: variable 'ansible_shell_executable' from source: unknown 11792 1727096128.38156: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.38162: variable 'ansible_pipelining' from source: unknown 11792 1727096128.38171: variable 'ansible_timeout' from source: unknown 11792 1727096128.38178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.38324: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096128.38339: variable 'omit' from source: magic vars 11792 1727096128.38348: starting attempt loop 11792 1727096128.38358: running the handler 11792 1727096128.38501: variable 'interface_stat' from source: set_fact 11792 1727096128.38524: Evaluated conditional (interface_stat.stat.exists): True 11792 1727096128.38542: handler run complete 11792 1727096128.38564: attempt loop complete, returning result 11792 1727096128.38573: _execute() done 11792 1727096128.38579: dumping result to json 11792 1727096128.38585: done dumping result, returning 11792 1727096128.38596: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' [0afff68d-5257-d9c7-3fc0-00000000011d] 11792 1727096128.38603: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000011d ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096128.38802: no more pending results, returning what we have 11792 1727096128.38806: results queue empty 11792 1727096128.38807: checking for any_errors_fatal 11792 1727096128.38818: done checking for any_errors_fatal 11792 1727096128.38819: checking for max_fail_percentage 11792 1727096128.38820: done checking for max_fail_percentage 11792 1727096128.38821: checking to see if all hosts have failed and the running result is not ok 11792 1727096128.38822: done checking to see if all hosts have failed 11792 1727096128.38823: getting the remaining hosts for this loop 11792 1727096128.38825: done getting the remaining hosts for this loop 11792 1727096128.38828: getting the next task for host managed_node2 11792 1727096128.38837: done getting next task for host managed_node2 11792 1727096128.38839: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11792 1727096128.38846: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096128.38852: getting variables 11792 1727096128.38854: in VariableManager get_vars() 11792 1727096128.38893: Calling all_inventory to load vars for managed_node2 11792 1727096128.38896: Calling groups_inventory to load vars for managed_node2 11792 1727096128.38900: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096128.38914: Calling all_plugins_play to load vars for managed_node2 11792 1727096128.38918: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096128.38922: Calling groups_plugins_play to load vars for managed_node2 11792 1727096128.39623: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000011d 11792 1727096128.39627: WORKER PROCESS EXITING 11792 1727096128.39661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096128.40210: done with get_vars() 11792 1727096128.40225: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:28 -0400 (0:00:00.055) 0:00:10.683 ****** 11792 1727096128.40449: entering _queue_task() for managed_node2/include_tasks 11792 1727096128.41119: worker is 1 (out of 1 available) 11792 1727096128.41132: exiting _queue_task() for managed_node2/include_tasks 11792 1727096128.41146: done queuing things up, now waiting for results queue to drain 11792 1727096128.41148: waiting for pending results... 11792 1727096128.41633: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11792 1727096128.41943: in run() - task 0afff68d-5257-d9c7-3fc0-000000000121 11792 1727096128.41956: variable 'ansible_search_path' from source: unknown 11792 1727096128.41959: variable 'ansible_search_path' from source: unknown 11792 1727096128.42085: calling self._execute() 11792 1727096128.42159: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.42166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.42177: variable 'omit' from source: magic vars 11792 1727096128.43002: variable 'ansible_distribution_major_version' from source: facts 11792 1727096128.43013: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096128.43017: _execute() done 11792 1727096128.43072: dumping result to json 11792 1727096128.43075: done dumping result, returning 11792 1727096128.43118: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-d9c7-3fc0-000000000121] 11792 1727096128.43122: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000121 11792 1727096128.43198: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000121 11792 1727096128.43201: WORKER PROCESS EXITING 11792 1727096128.43236: no more pending results, returning what we have 11792 1727096128.43242: in VariableManager get_vars() 11792 1727096128.43286: Calling all_inventory to load vars for managed_node2 11792 1727096128.43289: Calling groups_inventory to load vars for managed_node2 11792 1727096128.43292: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096128.43306: Calling all_plugins_play to load vars for managed_node2 11792 1727096128.43309: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096128.43311: Calling 
groups_plugins_play to load vars for managed_node2 11792 1727096128.43857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096128.44369: done with get_vars() 11792 1727096128.44377: variable 'ansible_search_path' from source: unknown 11792 1727096128.44378: variable 'ansible_search_path' from source: unknown 11792 1727096128.44410: we have included files to process 11792 1727096128.44411: generating all_blocks data 11792 1727096128.44412: done generating all_blocks data 11792 1727096128.44415: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096128.44416: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096128.44418: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096128.44831: done processing included file 11792 1727096128.44833: iterating over new_blocks loaded from include file 11792 1727096128.44834: in VariableManager get_vars() 11792 1727096128.44852: done with get_vars() 11792 1727096128.44854: filtering new block on tags 11792 1727096128.44983: done filtering new block on tags 11792 1727096128.44991: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11792 1727096128.44996: extending task lists for all hosts with included blocks 11792 1727096128.45411: done extending task lists 11792 1727096128.45413: done processing included files 11792 1727096128.45413: results queue empty 11792 1727096128.45414: checking for any_errors_fatal 11792 1727096128.45417: done checking for any_errors_fatal 11792 1727096128.45532: checking for max_fail_percentage 11792 1727096128.45534: done checking for max_fail_percentage 11792 1727096128.45535: checking to see if all hosts have failed and the running result is not ok 11792 1727096128.45536: done checking to see if all hosts have failed 11792 1727096128.45537: getting the remaining hosts for this loop 11792 1727096128.45538: done getting the remaining hosts for this loop 11792 1727096128.45541: getting the next task for host managed_node2 11792 1727096128.45546: done getting next task for host managed_node2 11792 1727096128.45548: ^ task is: TASK: Get stat for interface {{ interface }} 11792 1727096128.45555: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096128.45558: getting variables 11792 1727096128.45559: in VariableManager get_vars() 11792 1727096128.45570: Calling all_inventory to load vars for managed_node2 11792 1727096128.45572: Calling groups_inventory to load vars for managed_node2 11792 1727096128.45574: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096128.45580: Calling all_plugins_play to load vars for managed_node2 11792 1727096128.45583: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096128.45585: Calling groups_plugins_play to load vars for managed_node2 11792 1727096128.45844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096128.46395: done with get_vars() 11792 1727096128.46406: done getting variables 11792 1727096128.46643: variable 'interface' from source: task vars 11792 1727096128.46647: variable 'dhcp_interface2' from source: play vars 11792 1727096128.46877: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:28 -0400 (0:00:00.064) 0:00:10.748 ****** 11792 1727096128.46913: entering _queue_task() for managed_node2/stat 11792 1727096128.47679: worker is 1 (out of 1 available) 11792 1727096128.47698: exiting _queue_task() for managed_node2/stat 11792 1727096128.47712: done queuing things up, now waiting for results queue to drain 11792 1727096128.47714: waiting for pending results... 
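(For context: the included get_interface_stat.yml being queued here almost certainly reduces to a single stat task against /sys/class/net/{{ interface }}, with its result registered as interface_stat; the sketch below is reconstructed from the module arguments and variable sources visible further down in this log, not copied from the test file.)

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # interface resolves to 'test2' via the dhcp_interface2 play var
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # later read back as "variable 'interface_stat' from source: set_fact"
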
11792 1727096128.48252: running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 11792 1727096128.48317: in run() - task 0afff68d-5257-d9c7-3fc0-00000000019f 11792 1727096128.48329: variable 'ansible_search_path' from source: unknown 11792 1727096128.48333: variable 'ansible_search_path' from source: unknown 11792 1727096128.48369: calling self._execute() 11792 1727096128.48556: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.48561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.48574: variable 'omit' from source: magic vars 11792 1727096128.49629: variable 'ansible_distribution_major_version' from source: facts 11792 1727096128.49633: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096128.49635: variable 'omit' from source: magic vars 11792 1727096128.49844: variable 'omit' from source: magic vars 11792 1727096128.49894: variable 'interface' from source: task vars 11792 1727096128.49960: variable 'dhcp_interface2' from source: play vars 11792 1727096128.50021: variable 'dhcp_interface2' from source: play vars 11792 1727096128.50108: variable 'omit' from source: magic vars 11792 1727096128.50215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096128.50255: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096128.50410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096128.50587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096128.50606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096128.50661: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096128.50774: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.50777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.50981: Set connection var ansible_timeout to 10 11792 1727096128.50996: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096128.51012: Set connection var ansible_shell_executable to /bin/sh 11792 1727096128.51022: Set connection var ansible_pipelining to False 11792 1727096128.51031: Set connection var ansible_shell_type to sh 11792 1727096128.51072: Set connection var ansible_connection to ssh 11792 1727096128.51101: variable 'ansible_shell_executable' from source: unknown 11792 1727096128.51272: variable 'ansible_connection' from source: unknown 11792 1727096128.51276: variable 'ansible_module_compression' from source: unknown 11792 1727096128.51279: variable 'ansible_shell_type' from source: unknown 11792 1727096128.51282: variable 'ansible_shell_executable' from source: unknown 11792 1727096128.51284: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.51286: variable 'ansible_pipelining' from source: unknown 11792 1727096128.51288: variable 'ansible_timeout' from source: unknown 11792 1727096128.51290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.51621: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096128.51625: variable 'omit' from source: magic vars 11792 1727096128.51627: starting attempt loop 11792 1727096128.51630: running the handler 11792 1727096128.51644: _low_level_execute_command(): starting 11792 1727096128.51738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096128.53330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096128.53397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.53474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.53486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.55195: stdout chunk (state=3): >>>/root <<< 11792 1727096128.55296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.55348: stderr chunk (state=3): >>><<< 11792 1727096128.55362: stdout chunk (state=3): >>><<< 11792 1727096128.55594: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096128.55598: _low_level_execute_command(): starting 11792 1727096128.55600: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416 `" && echo ansible-tmp-1727096128.5550973-12299-57738039263416="` echo /root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416 `" ) && sleep 0' 11792 1727096128.56863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096128.56880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096128.56966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096128.57019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096128.57129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.57173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.59198: stdout chunk (state=3): >>>ansible-tmp-1727096128.5550973-12299-57738039263416=/root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416 <<< 11792 1727096128.59298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.59375: stderr chunk (state=3): >>><<< 11792 1727096128.59378: stdout chunk (state=3): >>><<< 11792 1727096128.59396: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096128.5550973-12299-57738039263416=/root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096128.59591: variable 'ansible_module_compression' from source: unknown 11792 
1727096128.59700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096128.59722: variable 'ansible_facts' from source: unknown 11792 1727096128.59941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/AnsiballZ_stat.py 11792 1727096128.60307: Sending initial data 11792 1727096128.60316: Sent initial data (152 bytes) 11792 1727096128.61413: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096128.61431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096128.61561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096128.61700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.61703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.61727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.61799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.63531: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096128.63589: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096128.63649: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp866h6n3u /root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/AnsiballZ_stat.py <<< 11792 1727096128.63672: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/AnsiballZ_stat.py" <<< 11792 1727096128.63700: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp866h6n3u" to remote "/root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/AnsiballZ_stat.py" <<< 11792 1727096128.65462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.65473: stdout chunk (state=3): >>><<< 11792 1727096128.65477: stderr chunk (state=3): >>><<< 11792 1727096128.65479: done transferring module to remote 11792 1727096128.65481: _low_level_execute_command(): starting 11792 1727096128.65483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/ /root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/AnsiballZ_stat.py && sleep 0' 11792 1727096128.66611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096128.66871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.66987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.67060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.69031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.69035: stdout chunk (state=3): >>><<< 11792 1727096128.69037: stderr chunk (state=3): >>><<< 11792 1727096128.69147: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096128.69155: _low_level_execute_command(): starting 11792 1727096128.69160: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/AnsiballZ_stat.py && sleep 0' 11792 1727096128.70294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096128.70469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096128.70753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.70771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.86649: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26782, "dev": 23, "nlink": 1, "atime": 1727096126.624452, "mtime": 1727096126.624452, "ctime": 1727096126.624452, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096128.88147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096128.88151: stdout chunk (state=3): >>><<< 11792 1727096128.88535: stderr chunk (state=3): >>><<< 11792 1727096128.88540: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26782, "dev": 23, "nlink": 1, "atime": 1727096126.624452, "mtime": 1727096126.624452, "ctime": 1727096126.624452, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096128.88542: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096128.88544: _low_level_execute_command(): starting 11792 1727096128.88546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096128.5550973-12299-57738039263416/ > /dev/null 2>&1 && sleep 0' 11792 1727096128.89638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096128.89835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096128.90005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096128.90031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096128.91988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096128.92001: stdout chunk (state=3): >>><<< 11792 1727096128.92037: stderr chunk (state=3): >>><<< 11792 1727096128.92243: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096128.92246: handler run complete 11792 1727096128.92249: attempt loop complete, returning result 11792 1727096128.92251: _execute() done 11792 1727096128.92253: dumping result to json 11792 1727096128.92255: done dumping result, returning 11792 1727096128.92257: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 [0afff68d-5257-d9c7-3fc0-00000000019f] 11792 1727096128.92259: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000019f 11792 1727096128.92540: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000019f 11792 1727096128.92543: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096126.624452, "block_size": 4096, "blocks": 0, "ctime": 1727096126.624452, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26782, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727096126.624452, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11792 1727096128.92734: no more pending results, returning what we have 11792 1727096128.92737: results queue empty 11792 1727096128.92738: checking for any_errors_fatal 11792 1727096128.92740: done checking for any_errors_fatal 11792 1727096128.92740: checking for max_fail_percentage 11792 1727096128.92742: done checking for max_fail_percentage 11792 1727096128.92743: checking to see if all hosts have failed and the running result is not ok 11792 1727096128.92744: done checking to see if all hosts have failed 11792 1727096128.92744: getting the remaining hosts for this loop 11792 1727096128.92746: done getting the remaining hosts for this loop 11792 1727096128.92750: getting the next task for host managed_node2 11792 1727096128.92758: done getting next task for host managed_node2 11792 1727096128.92761: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11792 1727096128.92765: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096128.93072: getting variables 11792 1727096128.93074: in VariableManager get_vars() 11792 1727096128.93100: Calling all_inventory to load vars for managed_node2 11792 1727096128.93103: Calling groups_inventory to load vars for managed_node2 11792 1727096128.93107: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096128.93119: Calling all_plugins_play to load vars for managed_node2 11792 1727096128.93122: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096128.93125: Calling groups_plugins_play to load vars for managed_node2 11792 1727096128.93688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096128.94087: done with get_vars() 11792 1727096128.94099: done getting variables 11792 1727096128.94152: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096128.94263: variable 'interface' from source: task vars 11792 1727096128.94469: variable 'dhcp_interface2' from source: play vars 11792 1727096128.94530: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:28 -0400 (0:00:00.476) 0:00:11.224 ****** 11792 1727096128.94563: entering _queue_task() for managed_node2/assert 11792 1727096128.95304: worker is 1 (out of 1 available) 11792 1727096128.95313: exiting _queue_task() for managed_node2/assert 11792 1727096128.95325: done queuing things up, now waiting for results queue to drain 11792 1727096128.95326: waiting for pending results... 
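(For context: the assertion about to run comes from assert_device_present.yml, whose logged task paths (:3 for the include, :5 for the assert) together with the conditional interface_stat.stat.exists evaluated earlier suggest roughly the structure below; any failure message the real file carries is not visible in this log and is omitted.)

- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
  # on success the assert action reports its default "All assertions passed" msg,
  # which is exactly what shows up in the ok: result for this task
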
11792 1727096128.95786: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' 11792 1727096128.95889: in run() - task 0afff68d-5257-d9c7-3fc0-000000000122 11792 1727096128.95909: variable 'ansible_search_path' from source: unknown 11792 1727096128.95923: variable 'ansible_search_path' from source: unknown 11792 1727096128.95970: calling self._execute() 11792 1727096128.96246: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.96263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.96279: variable 'omit' from source: magic vars 11792 1727096128.97021: variable 'ansible_distribution_major_version' from source: facts 11792 1727096128.97086: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096128.97098: variable 'omit' from source: magic vars 11792 1727096128.97171: variable 'omit' from source: magic vars 11792 1727096128.97426: variable 'interface' from source: task vars 11792 1727096128.97442: variable 'dhcp_interface2' from source: play vars 11792 1727096128.97572: variable 'dhcp_interface2' from source: play vars 11792 1727096128.97576: variable 'omit' from source: magic vars 11792 1727096128.97690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096128.97797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096128.97893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096128.97918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096128.97936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096128.98172: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096128.98176: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.98179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.98310: Set connection var ansible_timeout to 10 11792 1727096128.98326: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096128.98341: Set connection var ansible_shell_executable to /bin/sh 11792 1727096128.98354: Set connection var ansible_pipelining to False 11792 1727096128.98362: Set connection var ansible_shell_type to sh 11792 1727096128.98371: Set connection var ansible_connection to ssh 11792 1727096128.98398: variable 'ansible_shell_executable' from source: unknown 11792 1727096128.98626: variable 'ansible_connection' from source: unknown 11792 1727096128.98630: variable 'ansible_module_compression' from source: unknown 11792 1727096128.98632: variable 'ansible_shell_type' from source: unknown 11792 1727096128.98634: variable 'ansible_shell_executable' from source: unknown 11792 1727096128.98636: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096128.98638: variable 'ansible_pipelining' from source: unknown 11792 1727096128.98640: variable 'ansible_timeout' from source: unknown 11792 1727096128.98643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096128.98794: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096128.98811: variable 'omit' from source: magic vars 11792 1727096128.98853: starting attempt loop 11792 1727096128.98864: running the handler 11792 1727096128.99274: variable 'interface_stat' from source: set_fact 11792 1727096128.99278: Evaluated conditional (interface_stat.stat.exists): True 11792 1727096128.99279: handler run complete 11792 1727096128.99281: attempt loop complete, returning result 11792 1727096128.99283: _execute() done 11792 1727096128.99284: dumping result to json 11792 1727096128.99286: done dumping result, returning 11792 1727096128.99297: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' [0afff68d-5257-d9c7-3fc0-000000000122] 11792 1727096128.99305: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000122 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096128.99457: no more pending results, returning what we have 11792 1727096128.99461: results queue empty 11792 1727096128.99462: checking for any_errors_fatal 11792 1727096128.99473: done checking for any_errors_fatal 11792 1727096128.99474: checking for max_fail_percentage 11792 1727096128.99476: done checking for max_fail_percentage 11792 1727096128.99478: checking to see if all hosts have failed and the running result is not ok 11792 1727096128.99478: done checking to see if all hosts have failed 11792 1727096128.99479: getting the remaining hosts for this loop 11792 1727096128.99481: done getting the remaining hosts for this loop 11792 1727096128.99484: getting the next task for host managed_node2 11792 1727096128.99494: done getting next task for host managed_node2 11792 1727096128.99497: ^ task is: TASK: Test 11792 1727096128.99501: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096128.99506: getting variables 11792 1727096128.99508: in VariableManager get_vars() 11792 1727096128.99538: Calling all_inventory to load vars for managed_node2 11792 1727096128.99541: Calling groups_inventory to load vars for managed_node2 11792 1727096128.99544: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096128.99555: Calling all_plugins_play to load vars for managed_node2 11792 1727096128.99558: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096128.99560: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.00092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.00329: done with get_vars() 11792 1727096129.00339: done getting variables 11792 1727096129.00574: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000122 11792 1727096129.00577: WORKER PROCESS EXITING TASK [Test] ******************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Monday 23 September 2024 08:55:29 -0400 (0:00:00.060) 0:00:11.285 ****** 11792 1727096129.00647: entering _queue_task() for managed_node2/include_tasks 11792 1727096129.01129: worker is 1 (out of 1 available) 11792 1727096129.01143: exiting _queue_task() for managed_node2/include_tasks 11792 1727096129.01156: done queuing things up, now waiting for results queue to drain 11792 1727096129.01158: waiting for pending results... 11792 1727096129.01586: running TaskExecutor() for managed_node2/TASK: Test 11792 1727096129.01845: in run() - task 0afff68d-5257-d9c7-3fc0-00000000008c 11792 1727096129.01863: variable 'ansible_search_path' from source: unknown 11792 1727096129.01866: variable 'ansible_search_path' from source: unknown 11792 1727096129.01912: variable 'lsr_test' from source: include params 11792 1727096129.02348: variable 'lsr_test' from source: include params 11792 1727096129.02529: variable 'omit' from source: magic vars 11792 1727096129.02780: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.02807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.02873: variable 'omit' from source: magic vars 11792 1727096129.03089: variable 'ansible_distribution_major_version' from source: facts 11792 1727096129.03104: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096129.03114: variable 'item' from source: unknown 11792 1727096129.03193: variable 'item' from source: unknown 11792 1727096129.03239: variable 'item' from source: unknown 11792 1727096129.03306: variable 'item' from source: unknown 11792 1727096129.03573: dumping result to json 11792 1727096129.03576: done dumping result, returning 11792 1727096129.03579: done running TaskExecutor() for managed_node2/TASK: Test [0afff68d-5257-d9c7-3fc0-00000000008c] 11792 1727096129.03582: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008c 11792 1727096129.03625: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008c 11792 1727096129.03628: WORKER PROCESS EXITING 11792 1727096129.03699: no more pending results, returning what we have 11792 1727096129.03706: in VariableManager get_vars() 11792 1727096129.03742: Calling all_inventory to load vars for managed_node2 11792 1727096129.03745: Calling groups_inventory to load vars for managed_node2 11792 1727096129.03749: 
Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.03767: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.03772: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.03776: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.04171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.04387: done with get_vars() 11792 1727096129.04396: variable 'ansible_search_path' from source: unknown 11792 1727096129.04397: variable 'ansible_search_path' from source: unknown 11792 1727096129.04445: we have included files to process 11792 1727096129.04446: generating all_blocks data 11792 1727096129.04448: done generating all_blocks data 11792 1727096129.04455: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11792 1727096129.04456: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11792 1727096129.04459: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11792 1727096129.04962: done processing included file 11792 1727096129.04964: iterating over new_blocks loaded from include file 11792 1727096129.04966: in VariableManager get_vars() 11792 1727096129.04986: done with get_vars() 11792 1727096129.04989: filtering new block on tags 11792 1727096129.05021: done filtering new block on tags 11792 1727096129.05024: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml for managed_node2 => (item=tasks/create_bond_profile.yml) 11792 1727096129.05030: extending task lists for all hosts with included blocks 11792 1727096129.06452: done extending task lists 11792 1727096129.06454: done processing included files 11792 1727096129.06455: results queue empty 11792 1727096129.06456: checking for any_errors_fatal 11792 1727096129.06459: done checking for any_errors_fatal 11792 1727096129.06459: checking for max_fail_percentage 11792 1727096129.06461: done checking for max_fail_percentage 11792 1727096129.06461: checking to see if all hosts have failed and the running result is not ok 11792 1727096129.06462: done checking to see if all hosts have failed 11792 1727096129.06463: getting the remaining hosts for this loop 11792 1727096129.06464: done getting the remaining hosts for this loop 11792 1727096129.06466: getting the next task for host managed_node2 11792 1727096129.06475: done getting next task for host managed_node2 11792 1727096129.06477: ^ task is: TASK: Include network role 11792 1727096129.06480: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096129.06482: getting variables 11792 1727096129.06483: in VariableManager get_vars() 11792 1727096129.06495: Calling all_inventory to load vars for managed_node2 11792 1727096129.06497: Calling groups_inventory to load vars for managed_node2 11792 1727096129.06500: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.06505: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.06507: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.06509: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.06663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.06873: done with get_vars() 11792 1727096129.06882: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:3 Monday 23 September 2024 08:55:29 -0400 (0:00:00.063) 0:00:11.348 ****** 11792 1727096129.06961: entering _queue_task() for managed_node2/include_role 11792 1727096129.06963: Creating lock for include_role 11792 1727096129.07506: worker is 1 (out of 1 available) 11792 1727096129.07514: exiting _queue_task() for managed_node2/include_role 11792 1727096129.07525: done queuing things up, now waiting for results queue to drain 11792 1727096129.07527: waiting for pending results... 11792 1727096129.07771: running TaskExecutor() for managed_node2/TASK: Include network role 11792 1727096129.07775: in run() - task 0afff68d-5257-d9c7-3fc0-0000000001c5 11792 1727096129.07779: variable 'ansible_search_path' from source: unknown 11792 1727096129.07781: variable 'ansible_search_path' from source: unknown 11792 1727096129.07815: calling self._execute() 11792 1727096129.07906: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.07919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.07931: variable 'omit' from source: magic vars 11792 1727096129.08338: variable 'ansible_distribution_major_version' from source: facts 11792 1727096129.08357: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096129.08369: _execute() done 11792 1727096129.08410: dumping result to json 11792 1727096129.08414: done dumping result, returning 11792 1727096129.08417: done running TaskExecutor() for managed_node2/TASK: Include network role [0afff68d-5257-d9c7-3fc0-0000000001c5] 11792 1727096129.08419: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000001c5 11792 1727096129.08712: no more pending results, returning what we have 11792 1727096129.08717: in VariableManager get_vars() 11792 1727096129.08869: Calling all_inventory to load vars for managed_node2 11792 1727096129.08872: Calling groups_inventory to load vars for managed_node2 11792 1727096129.08875: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.08885: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.08888: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.08891: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.09392: done sending 
task result for task 0afff68d-5257-d9c7-3fc0-0000000001c5 11792 1727096129.09395: WORKER PROCESS EXITING 11792 1727096129.09418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.09807: done with get_vars() 11792 1727096129.09814: variable 'ansible_search_path' from source: unknown 11792 1727096129.09815: variable 'ansible_search_path' from source: unknown 11792 1727096129.10234: variable 'omit' from source: magic vars 11792 1727096129.10393: variable 'omit' from source: magic vars 11792 1727096129.10408: variable 'omit' from source: magic vars 11792 1727096129.10412: we have included files to process 11792 1727096129.10413: generating all_blocks data 11792 1727096129.10414: done generating all_blocks data 11792 1727096129.10416: processing included file: fedora.linux_system_roles.network 11792 1727096129.10437: in VariableManager get_vars() 11792 1727096129.10448: done with get_vars() 11792 1727096129.10634: in VariableManager get_vars() 11792 1727096129.10653: done with get_vars() 11792 1727096129.10808: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11792 1727096129.11367: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11792 1727096129.11632: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11792 1727096129.13188: in VariableManager get_vars() 11792 1727096129.13210: done with get_vars() 11792 1727096129.14065: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096129.18265: iterating over new_blocks loaded from include file 11792 1727096129.18270: in VariableManager get_vars() 11792 1727096129.18365: done with get_vars() 11792 1727096129.18369: filtering new block on tags 11792 1727096129.19127: done filtering new block on tags 11792 1727096129.19132: in VariableManager get_vars() 11792 1727096129.19149: done with get_vars() 11792 1727096129.19155: filtering new block on tags 11792 1727096129.19273: done filtering new block on tags 11792 1727096129.19275: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 11792 1727096129.19281: extending task lists for all hosts with included blocks 11792 1727096129.19663: done extending task lists 11792 1727096129.19665: done processing included files 11792 1727096129.19666: results queue empty 11792 1727096129.19666: checking for any_errors_fatal 11792 1727096129.19673: done checking for any_errors_fatal 11792 1727096129.19674: checking for max_fail_percentage 11792 1727096129.19675: done checking for max_fail_percentage 11792 1727096129.19675: checking to see if all hosts have failed and the running result is not ok 11792 1727096129.19676: done checking to see if all hosts have failed 11792 1727096129.19677: getting the remaining hosts for this loop 11792 1727096129.19678: done getting the remaining hosts for this loop 11792 1727096129.19681: getting the next task for host managed_node2 11792 1727096129.19686: done getting next task for host managed_node2 11792 1727096129.19689: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096129.19692: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096129.19817: getting variables 11792 1727096129.19819: in VariableManager get_vars() 11792 1727096129.19835: Calling all_inventory to load vars for managed_node2 11792 1727096129.19838: Calling groups_inventory to load vars for managed_node2 11792 1727096129.19840: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.19846: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.19858: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.19861: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.20639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.21173: done with get_vars() 11792 1727096129.21185: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:29 -0400 (0:00:00.143) 0:00:11.492 ****** 11792 1727096129.21322: entering _queue_task() for managed_node2/include_tasks 11792 1727096129.22166: worker is 1 (out of 1 available) 11792 1727096129.22180: exiting _queue_task() for managed_node2/include_tasks 11792 1727096129.22194: done queuing things up, now waiting for results queue to drain 11792 1727096129.22196: waiting for pending results... 
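For reference, the task queued here ("Ensure ansible_facts used by role", tasks/main.yml:4) is an include_tasks entry that pulls in set_facts.yml, as the following log lines show. A minimal, hedged sketch of what that entry in the role's tasks/main.yml might look like, inferred only from the include action and the conditional reported in the log below; the real role file may differ:

- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml                       # the log loads set_facts.yml for this include
  when: ansible_distribution_major_version != '6'    # conditional evaluated True in the log below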
11792 1727096129.22811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096129.23041: in run() - task 0afff68d-5257-d9c7-3fc0-000000000277 11792 1727096129.23071: variable 'ansible_search_path' from source: unknown 11792 1727096129.23194: variable 'ansible_search_path' from source: unknown 11792 1727096129.23205: calling self._execute() 11792 1727096129.23412: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.23420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.23485: variable 'omit' from source: magic vars 11792 1727096129.24284: variable 'ansible_distribution_major_version' from source: facts 11792 1727096129.24470: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096129.24578: _execute() done 11792 1727096129.24583: dumping result to json 11792 1727096129.24586: done dumping result, returning 11792 1727096129.24589: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-d9c7-3fc0-000000000277] 11792 1727096129.24591: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000277 11792 1727096129.24770: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000277 11792 1727096129.24776: WORKER PROCESS EXITING 11792 1727096129.24919: no more pending results, returning what we have 11792 1727096129.24924: in VariableManager get_vars() 11792 1727096129.24974: Calling all_inventory to load vars for managed_node2 11792 1727096129.24977: Calling groups_inventory to load vars for managed_node2 11792 1727096129.24980: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.24994: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.24997: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.25000: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.25644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.26121: done with get_vars() 11792 1727096129.26129: variable 'ansible_search_path' from source: unknown 11792 1727096129.26130: variable 'ansible_search_path' from source: unknown 11792 1727096129.26293: we have included files to process 11792 1727096129.26295: generating all_blocks data 11792 1727096129.26297: done generating all_blocks data 11792 1727096129.26301: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096129.26302: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096129.26304: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096129.27816: done processing included file 11792 1727096129.27819: iterating over new_blocks loaded from include file 11792 1727096129.27820: in VariableManager get_vars() 11792 1727096129.27847: done with get_vars() 11792 1727096129.27849: filtering new block on tags 11792 1727096129.27884: done filtering new block on tags 11792 1727096129.27888: in VariableManager get_vars() 11792 1727096129.27913: done with get_vars() 11792 1727096129.27915: filtering new block on tags 11792 1727096129.27971: done filtering new block on tags 11792 1727096129.27974: in 
VariableManager get_vars() 11792 1727096129.27996: done with get_vars() 11792 1727096129.27998: filtering new block on tags 11792 1727096129.28040: done filtering new block on tags 11792 1727096129.28043: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11792 1727096129.28049: extending task lists for all hosts with included blocks 11792 1727096129.30365: done extending task lists 11792 1727096129.30368: done processing included files 11792 1727096129.30369: results queue empty 11792 1727096129.30370: checking for any_errors_fatal 11792 1727096129.30373: done checking for any_errors_fatal 11792 1727096129.30374: checking for max_fail_percentage 11792 1727096129.30375: done checking for max_fail_percentage 11792 1727096129.30376: checking to see if all hosts have failed and the running result is not ok 11792 1727096129.30376: done checking to see if all hosts have failed 11792 1727096129.30377: getting the remaining hosts for this loop 11792 1727096129.30379: done getting the remaining hosts for this loop 11792 1727096129.30381: getting the next task for host managed_node2 11792 1727096129.30387: done getting next task for host managed_node2 11792 1727096129.30390: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096129.30394: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096129.30404: getting variables 11792 1727096129.30405: in VariableManager get_vars() 11792 1727096129.30421: Calling all_inventory to load vars for managed_node2 11792 1727096129.30424: Calling groups_inventory to load vars for managed_node2 11792 1727096129.30425: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.30430: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.30433: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.30435: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.30578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.30787: done with get_vars() 11792 1727096129.30800: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:55:29 -0400 (0:00:00.095) 0:00:11.587 ****** 11792 1727096129.30879: entering _queue_task() for managed_node2/setup 11792 1727096129.31209: worker is 1 (out of 1 available) 11792 1727096129.31221: exiting _queue_task() for managed_node2/setup 11792 1727096129.31235: done queuing things up, now waiting for results queue to drain 11792 1727096129.31236: waiting for pending results... 11792 1727096129.31821: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096129.32063: in run() - task 0afff68d-5257-d9c7-3fc0-0000000002d4 11792 1727096129.32069: variable 'ansible_search_path' from source: unknown 11792 1727096129.32072: variable 'ansible_search_path' from source: unknown 11792 1727096129.32275: calling self._execute() 11792 1727096129.32317: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.32326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.32337: variable 'omit' from source: magic vars 11792 1727096129.33225: variable 'ansible_distribution_major_version' from source: facts 11792 1727096129.33236: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096129.33556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096129.36330: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096129.36424: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096129.36473: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096129.36527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096129.36552: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096129.36743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096129.36747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11792 1727096129.36752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096129.36756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096129.36774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096129.36833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096129.36865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096129.36899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096129.36943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096129.36970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096129.37128: variable '__network_required_facts' from source: role '' defaults 11792 1727096129.37144: variable 'ansible_facts' from source: unknown 11792 1727096129.37243: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11792 1727096129.37259: when evaluation is False, skipping this task 11792 1727096129.37298: _execute() done 11792 1727096129.37427: dumping result to json 11792 1727096129.37430: done dumping result, returning 11792 1727096129.37433: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-d9c7-3fc0-0000000002d4] 11792 1727096129.37436: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d4 11792 1727096129.37514: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d4 11792 1727096129.37518: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096129.37566: no more pending results, returning what we have 11792 1727096129.37574: results queue empty 11792 1727096129.37575: checking for any_errors_fatal 11792 1727096129.37577: done checking for any_errors_fatal 11792 1727096129.37577: checking for max_fail_percentage 11792 1727096129.37579: done checking for max_fail_percentage 11792 1727096129.37580: checking to see if all hosts have failed and the running result is not ok 11792 1727096129.37580: done checking to see if all hosts have failed 11792 1727096129.37581: getting the remaining hosts for this loop 11792 1727096129.37583: done getting the remaining hosts for 
this loop 11792 1727096129.37587: getting the next task for host managed_node2 11792 1727096129.37598: done getting next task for host managed_node2 11792 1727096129.37601: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096129.37610: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096129.37645: getting variables 11792 1727096129.37647: in VariableManager get_vars() 11792 1727096129.37686: Calling all_inventory to load vars for managed_node2 11792 1727096129.37689: Calling groups_inventory to load vars for managed_node2 11792 1727096129.37691: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.37701: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.37704: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.37713: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.38186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.38728: done with get_vars() 11792 1727096129.38741: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:55:29 -0400 (0:00:00.079) 0:00:11.667 ****** 11792 1727096129.38872: entering _queue_task() for managed_node2/stat 11792 1727096129.39217: worker is 1 (out of 1 available) 11792 1727096129.39229: exiting _queue_task() for managed_node2/stat 11792 1727096129.39242: done queuing things up, now waiting for results queue to drain 11792 1727096129.39243: waiting for pending results... 
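The fact-gathering task skipped just above (set_facts.yml:3) only runs the setup module when facts the role needs are missing, and hides its result with no_log. A hedged sketch of that task, reconstructed from the module type, the no_log censoring, and the conditional quoted in the log; the gather_subset value is an assumption, not taken from the role:

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min    # assumption; the role defines the exact subset it requires
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true            # matches the "output has been hidden" skip result above

Because every fact in __network_required_facts was already present, the conditional evaluated False and the task was skipped without opening a connection to the host.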
11792 1727096129.39687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096129.39696: in run() - task 0afff68d-5257-d9c7-3fc0-0000000002d6 11792 1727096129.39701: variable 'ansible_search_path' from source: unknown 11792 1727096129.39704: variable 'ansible_search_path' from source: unknown 11792 1727096129.39724: calling self._execute() 11792 1727096129.39809: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.39814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.39829: variable 'omit' from source: magic vars 11792 1727096129.40224: variable 'ansible_distribution_major_version' from source: facts 11792 1727096129.40245: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096129.40427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096129.40747: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096129.40813: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096129.40856: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096129.40895: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096129.41173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096129.41176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096129.41179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096129.41182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096129.41277: variable '__network_is_ostree' from source: set_fact 11792 1727096129.41284: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096129.41287: when evaluation is False, skipping this task 11792 1727096129.41290: _execute() done 11792 1727096129.41292: dumping result to json 11792 1727096129.41297: done dumping result, returning 11792 1727096129.41305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-d9c7-3fc0-0000000002d6] 11792 1727096129.41308: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d6 11792 1727096129.41414: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d6 11792 1727096129.41416: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096129.41501: no more pending results, returning what we have 11792 1727096129.41505: results queue empty 11792 1727096129.41506: checking for any_errors_fatal 11792 1727096129.41515: done checking for any_errors_fatal 11792 1727096129.41516: checking for 
max_fail_percentage 11792 1727096129.41518: done checking for max_fail_percentage 11792 1727096129.41518: checking to see if all hosts have failed and the running result is not ok 11792 1727096129.41519: done checking to see if all hosts have failed 11792 1727096129.41520: getting the remaining hosts for this loop 11792 1727096129.41522: done getting the remaining hosts for this loop 11792 1727096129.41525: getting the next task for host managed_node2 11792 1727096129.41578: done getting next task for host managed_node2 11792 1727096129.41582: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096129.41590: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096129.41604: getting variables 11792 1727096129.41606: in VariableManager get_vars() 11792 1727096129.41779: Calling all_inventory to load vars for managed_node2 11792 1727096129.41782: Calling groups_inventory to load vars for managed_node2 11792 1727096129.41784: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.41794: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.41797: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.41800: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.42107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.42403: done with get_vars() 11792 1727096129.42421: done getting variables 11792 1727096129.42483: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:55:29 -0400 (0:00:00.036) 0:00:11.704 ****** 11792 1727096129.42542: entering _queue_task() for managed_node2/set_fact 11792 1727096129.43098: worker is 1 (out of 1 available) 11792 1727096129.43107: exiting _queue_task() for managed_node2/set_fact 11792 1727096129.43120: done queuing things up, now waiting for results queue to drain 11792 1727096129.43121: waiting for pending results... 11792 1727096129.43289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096129.43388: in run() - task 0afff68d-5257-d9c7-3fc0-0000000002d7 11792 1727096129.43403: variable 'ansible_search_path' from source: unknown 11792 1727096129.43407: variable 'ansible_search_path' from source: unknown 11792 1727096129.43444: calling self._execute() 11792 1727096129.43674: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.43678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.43681: variable 'omit' from source: magic vars 11792 1727096129.44077: variable 'ansible_distribution_major_version' from source: facts 11792 1727096129.44090: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096129.44293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096129.44619: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096129.44663: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096129.44711: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096129.44745: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096129.44845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096129.44874: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096129.44911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096129.44949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096129.45059: variable '__network_is_ostree' from source: set_fact 11792 1727096129.45070: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096129.45073: when evaluation is False, skipping this task 11792 1727096129.45076: _execute() done 11792 1727096129.45078: dumping result to json 11792 1727096129.45081: done dumping result, returning 11792 1727096129.45090: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-d9c7-3fc0-0000000002d7] 11792 1727096129.45093: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d7 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096129.45282: no more pending results, returning what we have 11792 1727096129.45286: results queue empty 11792 1727096129.45287: checking for any_errors_fatal 11792 1727096129.45292: done checking for any_errors_fatal 11792 1727096129.45293: checking for max_fail_percentage 11792 1727096129.45295: done checking for max_fail_percentage 11792 1727096129.45296: checking to see if all hosts have failed and the running result is not ok 11792 1727096129.45296: done checking to see if all hosts have failed 11792 1727096129.45297: getting the remaining hosts for this loop 11792 1727096129.45299: done getting the remaining hosts for this loop 11792 1727096129.45303: getting the next task for host managed_node2 11792 1727096129.45475: done getting next task for host managed_node2 11792 1727096129.45480: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096129.45487: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096129.45504: getting variables 11792 1727096129.45506: in VariableManager get_vars() 11792 1727096129.45540: Calling all_inventory to load vars for managed_node2 11792 1727096129.45543: Calling groups_inventory to load vars for managed_node2 11792 1727096129.45545: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096129.45554: Calling all_plugins_play to load vars for managed_node2 11792 1727096129.45557: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096129.45561: Calling groups_plugins_play to load vars for managed_node2 11792 1727096129.45889: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d7 11792 1727096129.45894: WORKER PROCESS EXITING 11792 1727096129.45915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096129.46162: done with get_vars() 11792 1727096129.46176: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:55:29 -0400 (0:00:00.037) 0:00:11.742 ****** 11792 1727096129.46330: entering _queue_task() for managed_node2/service_facts 11792 1727096129.46332: Creating lock for service_facts 11792 1727096129.46719: worker is 1 (out of 1 available) 11792 1727096129.46731: exiting _queue_task() for managed_node2/service_facts 11792 1727096129.46747: done queuing things up, now waiting for results queue to drain 11792 1727096129.46749: waiting for pending results... 
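The remaining set_facts.yml tasks covered in this stretch of the log are the two ostree checks skipped above (set_facts.yml:12 and :17) and the service_facts task (set_facts.yml:21) whose execution follows. A hedged sketch assembled from the task names, module types, and conditionals in the log; the stat path and the register/variable wiring are assumptions:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted              # assumed path; the log only shows a stat action
  register: __network_ostree_stat         # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __network_ostree_stat.stat.exists }}"
  when: not __network_is_ostree is defined

- name: Check which services are running
  service_facts:                          # populates ansible_facts.services, as in the JSON result below

Both ostree tasks were skipped because __network_is_ostree was already defined by an earlier set_fact, while service_facts runs over SSH; the role can then branch on entries such as ansible_facts.services['NetworkManager.service'].state, which the result below reports as running.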
11792 1727096129.47086: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096129.47165: in run() - task 0afff68d-5257-d9c7-3fc0-0000000002d9 11792 1727096129.47181: variable 'ansible_search_path' from source: unknown 11792 1727096129.47185: variable 'ansible_search_path' from source: unknown 11792 1727096129.47231: calling self._execute() 11792 1727096129.47322: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.47333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.47472: variable 'omit' from source: magic vars 11792 1727096129.48081: variable 'ansible_distribution_major_version' from source: facts 11792 1727096129.48098: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096129.48104: variable 'omit' from source: magic vars 11792 1727096129.48191: variable 'omit' from source: magic vars 11792 1727096129.48232: variable 'omit' from source: magic vars 11792 1727096129.48274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096129.48314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096129.48337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096129.48356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096129.48366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096129.48397: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096129.48400: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.48403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.48508: Set connection var ansible_timeout to 10 11792 1727096129.48517: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096129.48531: Set connection var ansible_shell_executable to /bin/sh 11792 1727096129.48536: Set connection var ansible_pipelining to False 11792 1727096129.48539: Set connection var ansible_shell_type to sh 11792 1727096129.48542: Set connection var ansible_connection to ssh 11792 1727096129.48570: variable 'ansible_shell_executable' from source: unknown 11792 1727096129.48574: variable 'ansible_connection' from source: unknown 11792 1727096129.48577: variable 'ansible_module_compression' from source: unknown 11792 1727096129.48579: variable 'ansible_shell_type' from source: unknown 11792 1727096129.48581: variable 'ansible_shell_executable' from source: unknown 11792 1727096129.48584: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096129.48586: variable 'ansible_pipelining' from source: unknown 11792 1727096129.48772: variable 'ansible_timeout' from source: unknown 11792 1727096129.48775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096129.48798: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096129.48808: variable 'omit' from source: magic vars 11792 
1727096129.48814: starting attempt loop 11792 1727096129.48817: running the handler 11792 1727096129.48831: _low_level_execute_command(): starting 11792 1727096129.48839: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096129.49682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096129.49726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096129.49754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096129.49828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096129.51555: stdout chunk (state=3): >>>/root <<< 11792 1727096129.51722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096129.51726: stdout chunk (state=3): >>><<< 11792 1727096129.51729: stderr chunk (state=3): >>><<< 11792 1727096129.51753: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096129.51777: _low_level_execute_command(): starting 11792 1727096129.51831: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605 `" && echo ansible-tmp-1727096129.517612-12336-208321004620605="` echo /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605 `" ) && sleep 
0' 11792 1727096129.52578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096129.52581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096129.52583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096129.52586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096129.52588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096129.52590: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096129.52592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096129.52602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096129.52604: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096129.52607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096129.52609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096129.52611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096129.52613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096129.52615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096129.52617: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096129.52619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096129.52621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096129.52623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096129.52686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096129.52750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096129.54803: stdout chunk (state=3): >>>ansible-tmp-1727096129.517612-12336-208321004620605=/root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605 <<< 11792 1727096129.54940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096129.54944: stdout chunk (state=3): >>><<< 11792 1727096129.54946: stderr chunk (state=3): >>><<< 11792 1727096129.55175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096129.517612-12336-208321004620605=/root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096129.55179: variable 'ansible_module_compression' from source: unknown 11792 1727096129.55182: ANSIBALLZ: Using lock for service_facts 11792 1727096129.55184: ANSIBALLZ: Acquiring lock 11792 1727096129.55186: ANSIBALLZ: Lock acquired: 139635226342176 11792 1727096129.55188: ANSIBALLZ: Creating module 11792 1727096129.69452: ANSIBALLZ: Writing module into payload 11792 1727096129.69576: ANSIBALLZ: Writing module 11792 1727096129.69676: ANSIBALLZ: Renaming module 11792 1727096129.69681: ANSIBALLZ: Done creating module 11792 1727096129.69683: variable 'ansible_facts' from source: unknown 11792 1727096129.69726: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/AnsiballZ_service_facts.py 11792 1727096129.69919: Sending initial data 11792 1727096129.69922: Sent initial data (161 bytes) 11792 1727096129.70634: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096129.71040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096129.71066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096129.71093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096129.71173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096129.72880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096129.72947: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096129.73008: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpzk1cpcox /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/AnsiballZ_service_facts.py <<< 11792 1727096129.73011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/AnsiballZ_service_facts.py" <<< 11792 1727096129.73062: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpzk1cpcox" to remote "/root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/AnsiballZ_service_facts.py" <<< 11792 1727096129.73832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096129.73852: stdout chunk (state=3): >>><<< 11792 1727096129.73866: stderr chunk (state=3): >>><<< 11792 1727096129.73917: done transferring module to remote 11792 1727096129.73943: _low_level_execute_command(): starting 11792 1727096129.73957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/ /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/AnsiballZ_service_facts.py && sleep 0' 11792 1727096129.75102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096129.75274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096129.75484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096129.75696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096129.77588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096129.77639: stderr chunk (state=3): >>><<< 11792 1727096129.77786: stdout chunk (state=3): >>><<< 11792 1727096129.77804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096129.77807: _low_level_execute_command(): starting 11792 1727096129.77813: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/AnsiballZ_service_facts.py && sleep 0' 11792 1727096129.79262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096129.79266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096129.79294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096129.79300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096129.79475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096129.79499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096129.79566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096131.52078: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 11792 1727096131.52090: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 11792 1727096131.52094: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11792 1727096131.53677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096131.53691: stdout chunk (state=3): >>><<< 11792 1727096131.53705: stderr chunk (state=3): >>><<< 11792 1727096131.53801: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": 
{"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096131.55079: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096131.55363: _low_level_execute_command(): starting 11792 1727096131.55367: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096129.517612-12336-208321004620605/ > /dev/null 2>&1 && sleep 0' 11792 1727096131.56466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096131.56472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096131.56476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096131.56678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096131.56716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096131.56775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096131.58758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096131.58762: stdout chunk (state=3): >>><<< 11792 1727096131.58765: stderr chunk (state=3): >>><<< 11792 1727096131.58789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096131.58795: handler run complete 11792 1727096131.59212: variable 'ansible_facts' from source: unknown 11792 1727096131.59357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096131.59885: variable 'ansible_facts' from source: unknown 11792 1727096131.61209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096131.61586: attempt loop complete, returning result 11792 1727096131.61672: _execute() done 11792 1727096131.61676: dumping result to json 11792 1727096131.61678: done dumping result, returning 11792 1727096131.61708: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-d9c7-3fc0-0000000002d9] 11792 1727096131.61721: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d9 11792 1727096131.62682: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002d9 11792 1727096131.62686: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096131.62745: no more pending results, returning what we have 11792 1727096131.62748: results queue empty 11792 1727096131.62749: checking for any_errors_fatal 11792 1727096131.62754: done checking for any_errors_fatal 11792 1727096131.62755: checking for max_fail_percentage 11792 1727096131.62756: done checking for max_fail_percentage 11792 1727096131.62757: checking to see if all hosts have failed and the running result is not ok 11792 1727096131.62757: done checking to see if all hosts have failed 11792 1727096131.62758: getting the remaining hosts for this loop 11792 1727096131.62759: done getting the remaining hosts for this loop 11792 1727096131.62763: getting the next task for host managed_node2 11792 1727096131.62772: done getting next task for host managed_node2 11792 1727096131.62775: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096131.62782: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096131.62792: getting variables 11792 1727096131.62794: in VariableManager get_vars() 11792 1727096131.62830: Calling all_inventory to load vars for managed_node2 11792 1727096131.62833: Calling groups_inventory to load vars for managed_node2 11792 1727096131.62839: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096131.62855: Calling all_plugins_play to load vars for managed_node2 11792 1727096131.62858: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096131.62861: Calling groups_plugins_play to load vars for managed_node2 11792 1727096131.63562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096131.64096: done with get_vars() 11792 1727096131.64113: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:55:31 -0400 (0:00:02.178) 0:00:13.921 ****** 11792 1727096131.64225: entering _queue_task() for managed_node2/package_facts 11792 1727096131.64228: Creating lock for package_facts 11792 1727096131.64592: worker is 1 (out of 1 available) 11792 1727096131.64717: exiting _queue_task() for managed_node2/package_facts 11792 1727096131.64731: done queuing things up, now waiting for results queue to drain 11792 1727096131.64732: waiting for pending results... 
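For orientation: the task being queued above is the role's fact-gathering step at roles/network/tasks/set_facts.yml:26, which calls the package_facts module (the subsequent trace shows the AnsiballZ_package_facts.py payload being built, transferred over the existing SSH mux connection, and executed with /usr/bin/python3.12). A minimal sketch of a task that would produce a trace like this one is shown below; it is an illustration of the module usage, not the role's verbatim source, and the manager option and the commented when-condition are assumptions added for clarity.

  # hypothetical sketch of the fact-gathering step traced below
  - name: Check which packages are installed
    ansible.builtin.package_facts:
      manager: auto        # let Ansible pick rpm/dnf on this host
    # later tasks can branch on the collected facts, e.g.:
    #   when: "'NetworkManager' in ansible_facts.packages"

The result is stored under ansible_facts.packages as a mapping from package name to a list of installed versions, which is the JSON structure visible in the module's stdout further down in this log.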
11792 1727096131.65286: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096131.65292: in run() - task 0afff68d-5257-d9c7-3fc0-0000000002da 11792 1727096131.65480: variable 'ansible_search_path' from source: unknown 11792 1727096131.65484: variable 'ansible_search_path' from source: unknown 11792 1727096131.65487: calling self._execute() 11792 1727096131.65623: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096131.65684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096131.65706: variable 'omit' from source: magic vars 11792 1727096131.66397: variable 'ansible_distribution_major_version' from source: facts 11792 1727096131.66409: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096131.66416: variable 'omit' from source: magic vars 11792 1727096131.66514: variable 'omit' from source: magic vars 11792 1727096131.66546: variable 'omit' from source: magic vars 11792 1727096131.66591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096131.66628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096131.66647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096131.66679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096131.66777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096131.66780: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096131.66784: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096131.66787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096131.66819: Set connection var ansible_timeout to 10 11792 1727096131.66828: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096131.66837: Set connection var ansible_shell_executable to /bin/sh 11792 1727096131.66881: Set connection var ansible_pipelining to False 11792 1727096131.66888: Set connection var ansible_shell_type to sh 11792 1727096131.66890: Set connection var ansible_connection to ssh 11792 1727096131.66893: variable 'ansible_shell_executable' from source: unknown 11792 1727096131.66971: variable 'ansible_connection' from source: unknown 11792 1727096131.66977: variable 'ansible_module_compression' from source: unknown 11792 1727096131.66979: variable 'ansible_shell_type' from source: unknown 11792 1727096131.66982: variable 'ansible_shell_executable' from source: unknown 11792 1727096131.66984: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096131.66986: variable 'ansible_pipelining' from source: unknown 11792 1727096131.66988: variable 'ansible_timeout' from source: unknown 11792 1727096131.66993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096131.67117: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096131.67122: variable 'omit' from source: magic vars 11792 
1727096131.67131: starting attempt loop 11792 1727096131.67134: running the handler 11792 1727096131.67157: _low_level_execute_command(): starting 11792 1727096131.67163: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096131.67695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096131.67700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096131.67742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096131.67762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096131.67798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096131.69564: stdout chunk (state=3): >>>/root <<< 11792 1727096131.69721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096131.69725: stdout chunk (state=3): >>><<< 11792 1727096131.69728: stderr chunk (state=3): >>><<< 11792 1727096131.69748: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096131.69772: _low_level_execute_command(): starting 11792 1727096131.69786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124 `" && echo ansible-tmp-1727096131.6975586-12413-165143550382124="` echo 
/root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124 `" ) && sleep 0' 11792 1727096131.70390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096131.70405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096131.70427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096131.70444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096131.70463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096131.70488: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096131.70523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096131.70554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096131.70606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096131.70610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096131.70614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096131.70653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096131.72711: stdout chunk (state=3): >>>ansible-tmp-1727096131.6975586-12413-165143550382124=/root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124 <<< 11792 1727096131.72817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096131.72840: stdout chunk (state=3): >>><<< 11792 1727096131.72843: stderr chunk (state=3): >>><<< 11792 1727096131.73073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096131.6975586-12413-165143550382124=/root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096131.73078: variable 'ansible_module_compression' from source: unknown 11792 1727096131.73080: ANSIBALLZ: Using lock for package_facts 11792 1727096131.73082: ANSIBALLZ: Acquiring lock 11792 1727096131.73084: ANSIBALLZ: Lock acquired: 139635222912208 11792 1727096131.73086: ANSIBALLZ: Creating module 11792 1727096131.94197: ANSIBALLZ: Writing module into payload 11792 1727096131.94326: ANSIBALLZ: Writing module 11792 1727096131.94355: ANSIBALLZ: Renaming module 11792 1727096131.94359: ANSIBALLZ: Done creating module 11792 1727096131.94384: variable 'ansible_facts' from source: unknown 11792 1727096131.94572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/AnsiballZ_package_facts.py 11792 1727096131.94826: Sending initial data 11792 1727096131.94829: Sent initial data (162 bytes) 11792 1727096131.96128: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096131.96200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096131.96204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096131.96258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096131.96436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096131.98043: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096131.98118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096131.98177: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpdcymivou /root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/AnsiballZ_package_facts.py <<< 11792 1727096131.98182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/AnsiballZ_package_facts.py" <<< 11792 1727096131.98217: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpdcymivou" to remote "/root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/AnsiballZ_package_facts.py" <<< 11792 1727096131.99889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096131.99909: stderr chunk (state=3): >>><<< 11792 1727096132.00096: stdout chunk (state=3): >>><<< 11792 1727096132.00100: done transferring module to remote 11792 1727096132.00102: _low_level_execute_command(): starting 11792 1727096132.00105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/ /root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/AnsiballZ_package_facts.py && sleep 0' 11792 1727096132.01388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096132.01416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096132.01481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096132.03302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096132.03335: stderr chunk (state=3): >>><<< 11792 1727096132.03374: stdout chunk (state=3): >>><<< 11792 1727096132.03396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096132.03555: _low_level_execute_command(): starting 11792 1727096132.03559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/AnsiballZ_package_facts.py && sleep 0' 11792 1727096132.04600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096132.04614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096132.04682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096132.04896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096132.04899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096132.04983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096132.50080: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": 
"google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11792 1727096132.50105: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": 
"libzstd", "version": "1.5.5", "rele<<< 11792 1727096132.50182: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": 
"2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11792 1727096132.50198: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": 
"c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 11792 1727096132.50250: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 11792 1727096132.50257: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 11792 1727096132.50300: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", 
"version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11792 1727096132.50315: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11792 1727096132.52176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096132.52210: stderr chunk (state=3): >>><<< 11792 1727096132.52214: stdout chunk (state=3): >>><<< 11792 1727096132.52252: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096132.53978: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096132.53996: _low_level_execute_command(): starting 11792 1727096132.53999: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096131.6975586-12413-165143550382124/ > /dev/null 2>&1 && sleep 0' 11792 1727096132.54449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096132.54456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096132.54487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096132.54491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096132.54493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096132.54495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096132.54543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096132.54546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096132.54549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096132.54605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096132.56571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096132.56575: stdout chunk (state=3): >>><<< 11792 1727096132.56578: stderr chunk (state=3): >>><<< 11792 1727096132.56774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096132.56778: handler run complete 11792 1727096132.57703: variable 'ansible_facts' from source: unknown 11792 1727096132.57944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096132.58993: variable 'ansible_facts' from source: unknown 11792 1727096132.59223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096132.59919: attempt loop complete, returning result 11792 1727096132.59922: _execute() done 11792 1727096132.59925: dumping result to json 11792 1727096132.60214: done dumping result, returning 11792 1727096132.60217: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-d9c7-3fc0-0000000002da] 11792 1727096132.60219: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002da 11792 1727096132.66840: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000002da 11792 1727096132.66844: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096132.66948: no more pending results, returning what we have 11792 1727096132.66951: results queue empty 11792 1727096132.66952: checking for any_errors_fatal 11792 1727096132.66956: done checking for any_errors_fatal 11792 1727096132.66956: checking for max_fail_percentage 11792 1727096132.66958: done checking for max_fail_percentage 11792 1727096132.66958: checking to see if all hosts have failed and the running result is not ok 11792 1727096132.66959: done checking to see if all hosts have failed 11792 1727096132.66960: getting the remaining hosts for this loop 11792 1727096132.66961: done getting the remaining hosts for this loop 11792 1727096132.66965: getting the next task for host managed_node2 11792 1727096132.66975: done getting next task for host managed_node2 11792 1727096132.66979: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096132.66985: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096132.66994: getting variables 11792 1727096132.66995: in VariableManager get_vars() 11792 1727096132.67020: Calling all_inventory to load vars for managed_node2 11792 1727096132.67023: Calling groups_inventory to load vars for managed_node2 11792 1727096132.67025: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096132.67039: Calling all_plugins_play to load vars for managed_node2 11792 1727096132.67043: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096132.67046: Calling groups_plugins_play to load vars for managed_node2 11792 1727096132.68215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096132.69791: done with get_vars() 11792 1727096132.69818: done getting variables 11792 1727096132.69887: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:32 -0400 (0:00:01.056) 0:00:14.978 ****** 11792 1727096132.69925: entering _queue_task() for managed_node2/debug 11792 1727096132.70253: worker is 1 (out of 1 available) 11792 1727096132.70266: exiting _queue_task() for managed_node2/debug 11792 1727096132.70383: done queuing things up, now waiting for results queue to drain 11792 1727096132.70385: waiting for pending results... 
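The entries that follow cover the role task "Print network provider", which reads the network_provider variable (set earlier via set_fact, as the "variable 'network_provider' from source: set_fact" entries show) and reports it through the debug action; in this run the result is the message "Using network provider: nm". A minimal sketch of such a debug task, assuming only the variable name visible in this log and an invented message format (this is not the actual definition at roles/network/tasks/main.yml:7), might look like:

# Illustrative sketch only -- the real task in fedora.linux_system_roles.network may differ.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"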
11792 1727096132.70607: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096132.70770: in run() - task 0afff68d-5257-d9c7-3fc0-000000000278 11792 1727096132.70776: variable 'ansible_search_path' from source: unknown 11792 1727096132.70778: variable 'ansible_search_path' from source: unknown 11792 1727096132.70789: calling self._execute() 11792 1727096132.70880: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096132.70897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096132.70909: variable 'omit' from source: magic vars 11792 1727096132.71293: variable 'ansible_distribution_major_version' from source: facts 11792 1727096132.71309: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096132.71325: variable 'omit' from source: magic vars 11792 1727096132.71433: variable 'omit' from source: magic vars 11792 1727096132.71501: variable 'network_provider' from source: set_fact 11792 1727096132.71524: variable 'omit' from source: magic vars 11792 1727096132.71580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096132.71618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096132.71657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096132.71671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096132.71688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096132.71766: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096132.71770: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096132.71773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096132.71841: Set connection var ansible_timeout to 10 11792 1727096132.71854: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096132.71871: Set connection var ansible_shell_executable to /bin/sh 11792 1727096132.71888: Set connection var ansible_pipelining to False 11792 1727096132.71895: Set connection var ansible_shell_type to sh 11792 1727096132.71901: Set connection var ansible_connection to ssh 11792 1727096132.71926: variable 'ansible_shell_executable' from source: unknown 11792 1727096132.71999: variable 'ansible_connection' from source: unknown 11792 1727096132.72002: variable 'ansible_module_compression' from source: unknown 11792 1727096132.72004: variable 'ansible_shell_type' from source: unknown 11792 1727096132.72006: variable 'ansible_shell_executable' from source: unknown 11792 1727096132.72008: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096132.72010: variable 'ansible_pipelining' from source: unknown 11792 1727096132.72012: variable 'ansible_timeout' from source: unknown 11792 1727096132.72014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096132.72120: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11792 1727096132.72141: variable 'omit' from source: magic vars 11792 1727096132.72238: starting attempt loop 11792 1727096132.72241: running the handler 11792 1727096132.72244: handler run complete 11792 1727096132.72246: attempt loop complete, returning result 11792 1727096132.72248: _execute() done 11792 1727096132.72250: dumping result to json 11792 1727096132.72253: done dumping result, returning 11792 1727096132.72255: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-d9c7-3fc0-000000000278] 11792 1727096132.72257: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000278 11792 1727096132.72575: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000278 11792 1727096132.72579: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 11792 1727096132.72635: no more pending results, returning what we have 11792 1727096132.72639: results queue empty 11792 1727096132.72639: checking for any_errors_fatal 11792 1727096132.72647: done checking for any_errors_fatal 11792 1727096132.72648: checking for max_fail_percentage 11792 1727096132.72650: done checking for max_fail_percentage 11792 1727096132.72650: checking to see if all hosts have failed and the running result is not ok 11792 1727096132.72651: done checking to see if all hosts have failed 11792 1727096132.72652: getting the remaining hosts for this loop 11792 1727096132.72653: done getting the remaining hosts for this loop 11792 1727096132.72657: getting the next task for host managed_node2 11792 1727096132.72664: done getting next task for host managed_node2 11792 1727096132.72671: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096132.72677: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096132.72693: getting variables 11792 1727096132.72694: in VariableManager get_vars() 11792 1727096132.72732: Calling all_inventory to load vars for managed_node2 11792 1727096132.72735: Calling groups_inventory to load vars for managed_node2 11792 1727096132.72737: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096132.72746: Calling all_plugins_play to load vars for managed_node2 11792 1727096132.72749: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096132.72752: Calling groups_plugins_play to load vars for managed_node2 11792 1727096132.74321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096132.75836: done with get_vars() 11792 1727096132.75860: done getting variables 11792 1727096132.75934: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:32 -0400 (0:00:00.060) 0:00:15.038 ****** 11792 1727096132.75964: entering _queue_task() for managed_node2/fail 11792 1727096132.75965: Creating lock for fail 11792 1727096132.76213: worker is 1 (out of 1 available) 11792 1727096132.76228: exiting _queue_task() for managed_node2/fail 11792 1727096132.76242: done queuing things up, now waiting for results queue to drain 11792 1727096132.76244: waiting for pending results... 
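The task queued above (tasks/main.yml:11) uses the fail action module; the execution that follows short-circuits on its first when condition, network_state != {}, which is False because network_state comes from the role defaults as an empty dict. A minimal sketch of such a task, assuming the message wording and a second guard on the initscripts provider (the log only confirms the fail module and the first condition):

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}                    # confirmed by the log; False here, so the task is skipped
    - network_provider == "initscripts"      # assumed second guard, never evaluated in this run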
11792 1727096132.76421: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096132.76515: in run() - task 0afff68d-5257-d9c7-3fc0-000000000279 11792 1727096132.76527: variable 'ansible_search_path' from source: unknown 11792 1727096132.76531: variable 'ansible_search_path' from source: unknown 11792 1727096132.76562: calling self._execute() 11792 1727096132.76631: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096132.76634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096132.76644: variable 'omit' from source: magic vars 11792 1727096132.76918: variable 'ansible_distribution_major_version' from source: facts 11792 1727096132.76929: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096132.77014: variable 'network_state' from source: role '' defaults 11792 1727096132.77027: Evaluated conditional (network_state != {}): False 11792 1727096132.77030: when evaluation is False, skipping this task 11792 1727096132.77034: _execute() done 11792 1727096132.77036: dumping result to json 11792 1727096132.77039: done dumping result, returning 11792 1727096132.77042: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-d9c7-3fc0-000000000279] 11792 1727096132.77045: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000279 11792 1727096132.77132: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000279 11792 1727096132.77134: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096132.77187: no more pending results, returning what we have 11792 1727096132.77190: results queue empty 11792 1727096132.77191: checking for any_errors_fatal 11792 1727096132.77197: done checking for any_errors_fatal 11792 1727096132.77197: checking for max_fail_percentage 11792 1727096132.77199: done checking for max_fail_percentage 11792 1727096132.77200: checking to see if all hosts have failed and the running result is not ok 11792 1727096132.77200: done checking to see if all hosts have failed 11792 1727096132.77201: getting the remaining hosts for this loop 11792 1727096132.77202: done getting the remaining hosts for this loop 11792 1727096132.77206: getting the next task for host managed_node2 11792 1727096132.77213: done getting next task for host managed_node2 11792 1727096132.77216: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096132.77223: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096132.77237: getting variables 11792 1727096132.77239: in VariableManager get_vars() 11792 1727096132.77284: Calling all_inventory to load vars for managed_node2 11792 1727096132.77287: Calling groups_inventory to load vars for managed_node2 11792 1727096132.77289: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096132.77297: Calling all_plugins_play to load vars for managed_node2 11792 1727096132.77299: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096132.77302: Calling groups_plugins_play to load vars for managed_node2 11792 1727096132.78595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096132.79469: done with get_vars() 11792 1727096132.79492: done getting variables 11792 1727096132.79538: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:32 -0400 (0:00:00.035) 0:00:15.074 ****** 11792 1727096132.79565: entering _queue_task() for managed_node2/fail 11792 1727096132.79811: worker is 1 (out of 1 available) 11792 1727096132.79827: exiting _queue_task() for managed_node2/fail 11792 1727096132.79841: done queuing things up, now waiting for results queue to drain 11792 1727096132.79843: waiting for pending results... 
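Likewise, the task queued above (tasks/main.yml:18) is another fail guard; the run below again stops at network_state != {}. A sketch under the same assumptions, with the version check inferred from the task name rather than from the log:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  fail:
    msg: Applying the network state configuration requires a managed host of major version 8 or later  # assumed wording
  when:
    - network_state != {}                              # confirmed by the log; False, task skipped
    - ansible_distribution_major_version | int < 8     # assumed guard implied by the task name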
11792 1727096132.80019: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096132.80112: in run() - task 0afff68d-5257-d9c7-3fc0-00000000027a 11792 1727096132.80123: variable 'ansible_search_path' from source: unknown 11792 1727096132.80126: variable 'ansible_search_path' from source: unknown 11792 1727096132.80157: calling self._execute() 11792 1727096132.80252: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096132.80256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096132.80260: variable 'omit' from source: magic vars 11792 1727096132.80550: variable 'ansible_distribution_major_version' from source: facts 11792 1727096132.80561: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096132.80773: variable 'network_state' from source: role '' defaults 11792 1727096132.80777: Evaluated conditional (network_state != {}): False 11792 1727096132.80779: when evaluation is False, skipping this task 11792 1727096132.80782: _execute() done 11792 1727096132.80784: dumping result to json 11792 1727096132.80786: done dumping result, returning 11792 1727096132.80788: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-d9c7-3fc0-00000000027a] 11792 1727096132.80791: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027a 11792 1727096132.80866: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027a 11792 1727096132.80872: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096132.80927: no more pending results, returning what we have 11792 1727096132.80931: results queue empty 11792 1727096132.80931: checking for any_errors_fatal 11792 1727096132.80939: done checking for any_errors_fatal 11792 1727096132.80939: checking for max_fail_percentage 11792 1727096132.80941: done checking for max_fail_percentage 11792 1727096132.80942: checking to see if all hosts have failed and the running result is not ok 11792 1727096132.80942: done checking to see if all hosts have failed 11792 1727096132.80943: getting the remaining hosts for this loop 11792 1727096132.80944: done getting the remaining hosts for this loop 11792 1727096132.80948: getting the next task for host managed_node2 11792 1727096132.80956: done getting next task for host managed_node2 11792 1727096132.80959: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096132.80965: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096132.80982: getting variables 11792 1727096132.80983: in VariableManager get_vars() 11792 1727096132.81026: Calling all_inventory to load vars for managed_node2 11792 1727096132.81030: Calling groups_inventory to load vars for managed_node2 11792 1727096132.81032: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096132.81043: Calling all_plugins_play to load vars for managed_node2 11792 1727096132.81045: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096132.81047: Calling groups_plugins_play to load vars for managed_node2 11792 1727096132.82827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096132.83821: done with get_vars() 11792 1727096132.83841: done getting variables 11792 1727096132.83888: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:32 -0400 (0:00:00.043) 0:00:15.118 ****** 11792 1727096132.83927: entering _queue_task() for managed_node2/fail 11792 1727096132.84277: worker is 1 (out of 1 available) 11792 1727096132.84288: exiting _queue_task() for managed_node2/fail 11792 1727096132.84300: done queuing things up, now waiting for results queue to drain 11792 1727096132.84301: waiting for pending results... 
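The teaming guard queued above (tasks/main.yml:25) evaluates three conditions in the run below: the EL10-or-later check and the Red Hat distribution check both come back True, while the selectattr filters over network_connections and network_state find no profiles of type "team", so the task is skipped. A sketch assuming only the error message; the conditions are taken from the log:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on this version of the managed host  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9     # True in this run
    - ansible_distribution in __network_rh_distros     # True in this run
    # False in this run: no connection or network_state interface has type "team"
    - >-
      network_connections | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0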
11792 1727096132.84616: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096132.84687: in run() - task 0afff68d-5257-d9c7-3fc0-00000000027b 11792 1727096132.84715: variable 'ansible_search_path' from source: unknown 11792 1727096132.84720: variable 'ansible_search_path' from source: unknown 11792 1727096132.84747: calling self._execute() 11792 1727096132.84842: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096132.84845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096132.84848: variable 'omit' from source: magic vars 11792 1727096132.85371: variable 'ansible_distribution_major_version' from source: facts 11792 1727096132.85375: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096132.85485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096132.87736: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096132.87823: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096132.87870: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096132.87915: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096132.87946: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096132.88092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.88096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.88098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.88141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.88162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.88274: variable 'ansible_distribution_major_version' from source: facts 11792 1727096132.88294: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11792 1727096132.88420: variable 'ansible_distribution' from source: facts 11792 1727096132.88472: variable '__network_rh_distros' from source: role '' defaults 11792 1727096132.88475: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11792 1727096132.88713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.88741: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.88779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.88826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.88844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.88908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.88972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.88975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.89008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.89029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.89081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.89108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.89134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.89188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.89374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.89535: variable 'network_connections' from source: include params 11792 1727096132.89553: variable 'controller_profile' from source: play vars 11792 1727096132.89624: variable 'controller_profile' from source: play vars 11792 1727096132.89639: variable 'controller_device' from source: play vars 11792 1727096132.89709: variable 'controller_device' from source: play vars 11792 1727096132.89729: variable 'port1_profile' from 
source: play vars 11792 1727096132.89793: variable 'port1_profile' from source: play vars 11792 1727096132.89805: variable 'dhcp_interface1' from source: play vars 11792 1727096132.89875: variable 'dhcp_interface1' from source: play vars 11792 1727096132.89888: variable 'controller_profile' from source: play vars 11792 1727096132.89955: variable 'controller_profile' from source: play vars 11792 1727096132.89969: variable 'port2_profile' from source: play vars 11792 1727096132.90033: variable 'port2_profile' from source: play vars 11792 1727096132.90046: variable 'dhcp_interface2' from source: play vars 11792 1727096132.90112: variable 'dhcp_interface2' from source: play vars 11792 1727096132.90124: variable 'controller_profile' from source: play vars 11792 1727096132.90200: variable 'controller_profile' from source: play vars 11792 1727096132.90214: variable 'network_state' from source: role '' defaults 11792 1727096132.90291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096132.90477: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096132.90519: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096132.90574: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096132.90682: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096132.90685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096132.90712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096132.90743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.90779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096132.90828: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11792 1727096132.90837: when evaluation is False, skipping this task 11792 1727096132.90845: _execute() done 11792 1727096132.90857: dumping result to json 11792 1727096132.90865: done dumping result, returning 11792 1727096132.90879: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-d9c7-3fc0-00000000027b] 11792 1727096132.90889: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027b skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", 
\"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11792 1727096132.91046: no more pending results, returning what we have 11792 1727096132.91049: results queue empty 11792 1727096132.91052: checking for any_errors_fatal 11792 1727096132.91058: done checking for any_errors_fatal 11792 1727096132.91059: checking for max_fail_percentage 11792 1727096132.91061: done checking for max_fail_percentage 11792 1727096132.91061: checking to see if all hosts have failed and the running result is not ok 11792 1727096132.91062: done checking to see if all hosts have failed 11792 1727096132.91063: getting the remaining hosts for this loop 11792 1727096132.91064: done getting the remaining hosts for this loop 11792 1727096132.91070: getting the next task for host managed_node2 11792 1727096132.91078: done getting next task for host managed_node2 11792 1727096132.91081: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096132.91086: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096132.91100: getting variables 11792 1727096132.91101: in VariableManager get_vars() 11792 1727096132.91136: Calling all_inventory to load vars for managed_node2 11792 1727096132.91139: Calling groups_inventory to load vars for managed_node2 11792 1727096132.91142: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096132.91157: Calling all_plugins_play to load vars for managed_node2 11792 1727096132.91160: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096132.91164: Calling groups_plugins_play to load vars for managed_node2 11792 1727096132.92182: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027b 11792 1727096132.92186: WORKER PROCESS EXITING 11792 1727096132.92847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096132.94350: done with get_vars() 11792 1727096132.94372: done getting variables 11792 1727096132.94448: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:32 -0400 (0:00:00.105) 0:00:15.223 ****** 11792 1727096132.94475: entering _queue_task() for managed_node2/dnf 11792 1727096132.94719: worker is 1 (out of 1 available) 11792 1727096132.94732: exiting _queue_task() for managed_node2/dnf 11792 1727096132.94745: done queuing things up, now waiting for results queue to drain 11792 1727096132.94747: waiting for pending results... 
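The DNF check queued above (tasks/main.yml:36) runs the dnf action module only when wireless or team connections are defined; the run below confirms the distribution guard is True and the wireless/team guard is False, so nothing is queried. A hypothetical sketch; the package list variable and check_mode are assumptions, only the module and the two conditions are confirmed by the log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"   # assumed variable holding the role's package list
    state: latest
  check_mode: true                   # assumed: query for available updates without installing
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # True in this run
    - __network_wireless_connections_defined or __network_team_connections_defined       # False; task skipped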
11792 1727096132.94923: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096132.95013: in run() - task 0afff68d-5257-d9c7-3fc0-00000000027c 11792 1727096132.95023: variable 'ansible_search_path' from source: unknown 11792 1727096132.95027: variable 'ansible_search_path' from source: unknown 11792 1727096132.95057: calling self._execute() 11792 1727096132.95123: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096132.95129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096132.95136: variable 'omit' from source: magic vars 11792 1727096132.95412: variable 'ansible_distribution_major_version' from source: facts 11792 1727096132.95421: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096132.95559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096132.97505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096132.97559: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096132.97593: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096132.97618: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096132.97639: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096132.97702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.97723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.97740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.97771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.97782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.97871: variable 'ansible_distribution' from source: facts 11792 1727096132.97874: variable 'ansible_distribution_major_version' from source: facts 11792 1727096132.97887: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11792 1727096132.97972: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096132.98059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.98078: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.98095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.98119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.98132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.98163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.98180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.98196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.98220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.98232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.98263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096132.98282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096132.98298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.98321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096132.98331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096132.98434: variable 'network_connections' from source: include params 11792 1727096132.98443: variable 'controller_profile' from source: play vars 11792 1727096132.98493: variable 'controller_profile' from source: play vars 11792 1727096132.98502: variable 'controller_device' from source: play vars 11792 1727096132.98543: variable 'controller_device' from source: play vars 11792 1727096132.98557: variable 'port1_profile' from 
source: play vars 11792 1727096132.98601: variable 'port1_profile' from source: play vars 11792 1727096132.98608: variable 'dhcp_interface1' from source: play vars 11792 1727096132.98650: variable 'dhcp_interface1' from source: play vars 11792 1727096132.98659: variable 'controller_profile' from source: play vars 11792 1727096132.98703: variable 'controller_profile' from source: play vars 11792 1727096132.98709: variable 'port2_profile' from source: play vars 11792 1727096132.98751: variable 'port2_profile' from source: play vars 11792 1727096132.98759: variable 'dhcp_interface2' from source: play vars 11792 1727096132.98804: variable 'dhcp_interface2' from source: play vars 11792 1727096132.98811: variable 'controller_profile' from source: play vars 11792 1727096132.98851: variable 'controller_profile' from source: play vars 11792 1727096132.98911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096132.99048: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096132.99105: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096132.99138: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096132.99192: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096132.99195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096132.99229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096132.99386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096132.99389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096132.99391: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096132.99574: variable 'network_connections' from source: include params 11792 1727096132.99579: variable 'controller_profile' from source: play vars 11792 1727096132.99636: variable 'controller_profile' from source: play vars 11792 1727096132.99643: variable 'controller_device' from source: play vars 11792 1727096132.99705: variable 'controller_device' from source: play vars 11792 1727096132.99748: variable 'port1_profile' from source: play vars 11792 1727096132.99777: variable 'port1_profile' from source: play vars 11792 1727096132.99784: variable 'dhcp_interface1' from source: play vars 11792 1727096132.99841: variable 'dhcp_interface1' from source: play vars 11792 1727096132.99847: variable 'controller_profile' from source: play vars 11792 1727096132.99910: variable 'controller_profile' from source: play vars 11792 1727096132.99918: variable 'port2_profile' from source: play vars 11792 1727096132.99980: variable 'port2_profile' from source: play vars 11792 1727096133.00056: variable 'dhcp_interface2' from source: play vars 11792 1727096133.00059: variable 'dhcp_interface2' from source: play 
vars 11792 1727096133.00062: variable 'controller_profile' from source: play vars 11792 1727096133.00107: variable 'controller_profile' from source: play vars 11792 1727096133.00140: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096133.00143: when evaluation is False, skipping this task 11792 1727096133.00145: _execute() done 11792 1727096133.00148: dumping result to json 11792 1727096133.00150: done dumping result, returning 11792 1727096133.00165: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-00000000027c] 11792 1727096133.00169: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027c 11792 1727096133.00259: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027c 11792 1727096133.00262: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096133.00424: no more pending results, returning what we have 11792 1727096133.00427: results queue empty 11792 1727096133.00428: checking for any_errors_fatal 11792 1727096133.00433: done checking for any_errors_fatal 11792 1727096133.00434: checking for max_fail_percentage 11792 1727096133.00436: done checking for max_fail_percentage 11792 1727096133.00437: checking to see if all hosts have failed and the running result is not ok 11792 1727096133.00437: done checking to see if all hosts have failed 11792 1727096133.00438: getting the remaining hosts for this loop 11792 1727096133.00439: done getting the remaining hosts for this loop 11792 1727096133.00442: getting the next task for host managed_node2 11792 1727096133.00449: done getting next task for host managed_node2 11792 1727096133.00452: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096133.00457: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096133.00471: getting variables 11792 1727096133.00473: in VariableManager get_vars() 11792 1727096133.00504: Calling all_inventory to load vars for managed_node2 11792 1727096133.00507: Calling groups_inventory to load vars for managed_node2 11792 1727096133.00509: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096133.00517: Calling all_plugins_play to load vars for managed_node2 11792 1727096133.00519: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096133.00522: Calling groups_plugins_play to load vars for managed_node2 11792 1727096133.01466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096133.02329: done with get_vars() 11792 1727096133.02345: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096133.02406: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:33 -0400 (0:00:00.079) 0:00:15.303 ****** 11792 1727096133.02430: entering _queue_task() for managed_node2/yum 11792 1727096133.02431: Creating lock for yum 11792 1727096133.02727: worker is 1 (out of 1 available) 11792 1727096133.02740: exiting _queue_task() for managed_node2/yum 11792 1727096133.02758: done queuing things up, now waiting for results queue to drain 11792 1727096133.02760: waiting for pending results... 
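The YUM variant queued above (tasks/main.yml:48) covers hosts older than major version 8; note that the controller redirects ansible.builtin.yum to ansible.builtin.dnf before loading the action. The run below skips it because ansible_distribution_major_version | int < 8 is False. A sketch mirroring the DNF variant, with the same assumed parameters:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:                               # redirected to ansible.builtin.dnf by this controller, per the log
    name: "{{ network_packages }}"   # assumed variable holding the role's package list
    state: latest
  check_mode: true                   # assumed
  when:
    - ansible_distribution_major_version | int < 8   # False in this run; task skipped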
11792 1727096133.03017: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096133.03153: in run() - task 0afff68d-5257-d9c7-3fc0-00000000027d 11792 1727096133.03188: variable 'ansible_search_path' from source: unknown 11792 1727096133.03196: variable 'ansible_search_path' from source: unknown 11792 1727096133.03237: calling self._execute() 11792 1727096133.03376: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096133.03383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096133.03389: variable 'omit' from source: magic vars 11792 1727096133.03773: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.03776: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096133.03877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096133.05655: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096133.05708: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096133.05738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096133.05765: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096133.05787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096133.05847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.05870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.05888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.05916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.05926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.05999: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.06015: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11792 1727096133.06018: when evaluation is False, skipping this task 11792 1727096133.06021: _execute() done 11792 1727096133.06023: dumping result to json 11792 1727096133.06025: done dumping result, returning 11792 1727096133.06033: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-00000000027d] 11792 
1727096133.06036: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027d 11792 1727096133.06127: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027d 11792 1727096133.06130: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11792 1727096133.06209: no more pending results, returning what we have 11792 1727096133.06212: results queue empty 11792 1727096133.06213: checking for any_errors_fatal 11792 1727096133.06219: done checking for any_errors_fatal 11792 1727096133.06220: checking for max_fail_percentage 11792 1727096133.06221: done checking for max_fail_percentage 11792 1727096133.06222: checking to see if all hosts have failed and the running result is not ok 11792 1727096133.06223: done checking to see if all hosts have failed 11792 1727096133.06223: getting the remaining hosts for this loop 11792 1727096133.06225: done getting the remaining hosts for this loop 11792 1727096133.06229: getting the next task for host managed_node2 11792 1727096133.06237: done getting next task for host managed_node2 11792 1727096133.06240: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096133.06246: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096133.06264: getting variables 11792 1727096133.06265: in VariableManager get_vars() 11792 1727096133.06299: Calling all_inventory to load vars for managed_node2 11792 1727096133.06302: Calling groups_inventory to load vars for managed_node2 11792 1727096133.06304: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096133.06311: Calling all_plugins_play to load vars for managed_node2 11792 1727096133.06314: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096133.06316: Calling groups_plugins_play to load vars for managed_node2 11792 1727096133.07635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096133.08545: done with get_vars() 11792 1727096133.08565: done getting variables 11792 1727096133.08612: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:33 -0400 (0:00:00.062) 0:00:15.365 ****** 11792 1727096133.08637: entering _queue_task() for managed_node2/fail 11792 1727096133.08883: worker is 1 (out of 1 available) 11792 1727096133.08896: exiting _queue_task() for managed_node2/fail 11792 1727096133.08910: done queuing things up, now waiting for results queue to drain 11792 1727096133.08911: waiting for pending results... 
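The consent task queued above (tasks/main.yml:60) is the last of the wireless/team guards, and the run below skips it for the same reason as the DNF and YUM checks. A sketch assuming the message wording and a hypothetical opt-in variable; only the fail module and the combined wireless/team condition are confirmed by the log:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: Wireless or team interfaces require restarting NetworkManager; confirm the restart to continue  # assumed wording
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined   # False in this run; task skipped
    - not network_allow_restart | d(false)   # hypothetical opt-in variable, not shown in the log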
11792 1727096133.09098: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096133.09193: in run() - task 0afff68d-5257-d9c7-3fc0-00000000027e 11792 1727096133.09207: variable 'ansible_search_path' from source: unknown 11792 1727096133.09210: variable 'ansible_search_path' from source: unknown 11792 1727096133.09237: calling self._execute() 11792 1727096133.09306: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096133.09311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096133.09320: variable 'omit' from source: magic vars 11792 1727096133.09594: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.09603: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096133.09685: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096133.09819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096133.11674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096133.11732: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096133.11762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096133.11791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096133.11815: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096133.11876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.11897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.11920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.11946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.11959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.11998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.12021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.12034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.12061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.12073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.12101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.12116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.12136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.12163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.12175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.12293: variable 'network_connections' from source: include params 11792 1727096133.12303: variable 'controller_profile' from source: play vars 11792 1727096133.12353: variable 'controller_profile' from source: play vars 11792 1727096133.12366: variable 'controller_device' from source: play vars 11792 1727096133.12409: variable 'controller_device' from source: play vars 11792 1727096133.12421: variable 'port1_profile' from source: play vars 11792 1727096133.12466: variable 'port1_profile' from source: play vars 11792 1727096133.12475: variable 'dhcp_interface1' from source: play vars 11792 1727096133.12516: variable 'dhcp_interface1' from source: play vars 11792 1727096133.12522: variable 'controller_profile' from source: play vars 11792 1727096133.12566: variable 'controller_profile' from source: play vars 11792 1727096133.12574: variable 'port2_profile' from source: play vars 11792 1727096133.12615: variable 'port2_profile' from source: play vars 11792 1727096133.12621: variable 'dhcp_interface2' from source: play vars 11792 1727096133.12669: variable 'dhcp_interface2' from source: play vars 11792 1727096133.12672: variable 'controller_profile' from source: play vars 11792 1727096133.12716: variable 'controller_profile' from source: play vars 11792 1727096133.12766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096133.12971: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096133.12974: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096133.12977: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096133.13010: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096133.13042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096133.13375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096133.13378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.13381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096133.13391: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096133.13433: variable 'network_connections' from source: include params 11792 1727096133.13436: variable 'controller_profile' from source: play vars 11792 1727096133.13494: variable 'controller_profile' from source: play vars 11792 1727096133.13501: variable 'controller_device' from source: play vars 11792 1727096133.13592: variable 'controller_device' from source: play vars 11792 1727096133.13596: variable 'port1_profile' from source: play vars 11792 1727096133.13620: variable 'port1_profile' from source: play vars 11792 1727096133.13627: variable 'dhcp_interface1' from source: play vars 11792 1727096133.13876: variable 'dhcp_interface1' from source: play vars 11792 1727096133.13879: variable 'controller_profile' from source: play vars 11792 1727096133.13882: variable 'controller_profile' from source: play vars 11792 1727096133.13884: variable 'port2_profile' from source: play vars 11792 1727096133.13886: variable 'port2_profile' from source: play vars 11792 1727096133.13888: variable 'dhcp_interface2' from source: play vars 11792 1727096133.13890: variable 'dhcp_interface2' from source: play vars 11792 1727096133.13892: variable 'controller_profile' from source: play vars 11792 1727096133.13928: variable 'controller_profile' from source: play vars 11792 1727096133.13963: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096133.13967: when evaluation is False, skipping this task 11792 1727096133.13971: _execute() done 11792 1727096133.13973: dumping result to json 11792 1727096133.13975: done dumping result, returning 11792 1727096133.13985: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-00000000027e] 11792 1727096133.13988: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027e 11792 1727096133.14075: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027e 11792 1727096133.14078: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096133.14132: no more pending results, returning what we have 11792 1727096133.14135: results queue empty 11792 1727096133.14136: checking for any_errors_fatal 11792 1727096133.14141: done checking for any_errors_fatal 11792 
1727096133.14142: checking for max_fail_percentage 11792 1727096133.14144: done checking for max_fail_percentage 11792 1727096133.14144: checking to see if all hosts have failed and the running result is not ok 11792 1727096133.14145: done checking to see if all hosts have failed 11792 1727096133.14146: getting the remaining hosts for this loop 11792 1727096133.14147: done getting the remaining hosts for this loop 11792 1727096133.14151: getting the next task for host managed_node2 11792 1727096133.14158: done getting next task for host managed_node2 11792 1727096133.14161: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11792 1727096133.14167: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096133.14182: getting variables 11792 1727096133.14184: in VariableManager get_vars() 11792 1727096133.14218: Calling all_inventory to load vars for managed_node2 11792 1727096133.14220: Calling groups_inventory to load vars for managed_node2 11792 1727096133.14222: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096133.14233: Calling all_plugins_play to load vars for managed_node2 11792 1727096133.14235: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096133.14238: Calling groups_plugins_play to load vars for managed_node2 11792 1727096133.15618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096133.17266: done with get_vars() 11792 1727096133.17301: done getting variables 11792 1727096133.17371: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:33 -0400 (0:00:00.087) 0:00:15.453 ****** 11792 1727096133.17408: entering _queue_task() for managed_node2/package 11792 1727096133.17761: worker is 1 (out of 1 available) 11792 1727096133.17879: exiting _queue_task() for managed_node2/package 11792 1727096133.17893: done queuing things up, now waiting for results queue to drain 11792 1727096133.17895: waiting for pending results... 11792 1727096133.18298: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11792 1727096133.18303: in run() - task 0afff68d-5257-d9c7-3fc0-00000000027f 11792 1727096133.18307: variable 'ansible_search_path' from source: unknown 11792 1727096133.18311: variable 'ansible_search_path' from source: unknown 11792 1727096133.18314: calling self._execute() 11792 1727096133.18403: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096133.18410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096133.18418: variable 'omit' from source: magic vars 11792 1727096133.18813: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.18825: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096133.19033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096133.19311: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096133.19352: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096133.19395: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096133.19434: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096133.19580: variable 'network_packages' from source: role '' defaults 11792 1727096133.19646: variable '__network_provider_setup' from source: role '' defaults 11792 1727096133.19661: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096133.19725: variable 
'__network_service_name_default_nm' from source: role '' defaults 11792 1727096133.19737: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096133.19820: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096133.19989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096133.27239: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096133.27309: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096133.27353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096133.27387: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096133.27434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096133.27513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.27540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.27570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.27674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.27677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.27688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.27862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.27865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.27870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.27873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.28236: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096133.28464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.28596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.28623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.28663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.28679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.28771: variable 'ansible_python' from source: facts 11792 1727096133.28950: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096133.29038: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096133.29251: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096133.29402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.29427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.29450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.29524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.29527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.29553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.29580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.29632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.29640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.29656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.29800: variable 'network_connections' from source: include params 11792 1727096133.29815: variable 'controller_profile' from source: play vars 11792 1727096133.29909: variable 'controller_profile' from source: play vars 11792 1727096133.29959: variable 'controller_device' from source: play vars 11792 1727096133.30015: variable 'controller_device' from source: play vars 11792 1727096133.30031: variable 'port1_profile' from source: play vars 11792 1727096133.30127: variable 'port1_profile' from source: play vars 11792 1727096133.30141: variable 'dhcp_interface1' from source: play vars 11792 1727096133.30251: variable 'dhcp_interface1' from source: play vars 11792 1727096133.30254: variable 'controller_profile' from source: play vars 11792 1727096133.30342: variable 'controller_profile' from source: play vars 11792 1727096133.30360: variable 'port2_profile' from source: play vars 11792 1727096133.30470: variable 'port2_profile' from source: play vars 11792 1727096133.30473: variable 'dhcp_interface2' from source: play vars 11792 1727096133.30553: variable 'dhcp_interface2' from source: play vars 11792 1727096133.30565: variable 'controller_profile' from source: play vars 11792 1727096133.30682: variable 'controller_profile' from source: play vars 11792 1727096133.30730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096133.30760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096133.30827: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.30831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096133.30865: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096133.31113: variable 'network_connections' from source: include params 11792 1727096133.31116: variable 'controller_profile' from source: play vars 11792 1727096133.31469: variable 'controller_profile' from source: play vars 11792 1727096133.31472: variable 'controller_device' from source: play vars 11792 1727096133.31474: variable 'controller_device' from source: play vars 11792 1727096133.31476: variable 'port1_profile' from source: play vars 11792 1727096133.31478: variable 'port1_profile' from source: play vars 11792 1727096133.31480: variable 'dhcp_interface1' from source: play vars 11792 1727096133.31633: variable 'dhcp_interface1' from source: play vars 11792 1727096133.31642: variable 'controller_profile' from source: play vars 11792 1727096133.32120: variable 'controller_profile' from source: play vars 11792 1727096133.32129: variable 'port2_profile' from source: play vars 11792 1727096133.32230: variable 'port2_profile' from source: play vars 11792 1727096133.32239: variable 'dhcp_interface2' from source: play vars 11792 1727096133.33031: variable 'dhcp_interface2' from source: play vars 11792 1727096133.33040: variable 'controller_profile' from 
source: play vars 11792 1727096133.33137: variable 'controller_profile' from source: play vars 11792 1727096133.33194: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096133.33265: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096133.33989: variable 'network_connections' from source: include params 11792 1727096133.33993: variable 'controller_profile' from source: play vars 11792 1727096133.34060: variable 'controller_profile' from source: play vars 11792 1727096133.34065: variable 'controller_device' from source: play vars 11792 1727096133.34332: variable 'controller_device' from source: play vars 11792 1727096133.34343: variable 'port1_profile' from source: play vars 11792 1727096133.34409: variable 'port1_profile' from source: play vars 11792 1727096133.34416: variable 'dhcp_interface1' from source: play vars 11792 1727096133.34481: variable 'dhcp_interface1' from source: play vars 11792 1727096133.34487: variable 'controller_profile' from source: play vars 11792 1727096133.34547: variable 'controller_profile' from source: play vars 11792 1727096133.34557: variable 'port2_profile' from source: play vars 11792 1727096133.34825: variable 'port2_profile' from source: play vars 11792 1727096133.34828: variable 'dhcp_interface2' from source: play vars 11792 1727096133.35098: variable 'dhcp_interface2' from source: play vars 11792 1727096133.35104: variable 'controller_profile' from source: play vars 11792 1727096133.35169: variable 'controller_profile' from source: play vars 11792 1727096133.35197: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096133.35275: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096133.36006: variable 'network_connections' from source: include params 11792 1727096133.36012: variable 'controller_profile' from source: play vars 11792 1727096133.36076: variable 'controller_profile' from source: play vars 11792 1727096133.36083: variable 'controller_device' from source: play vars 11792 1727096133.36143: variable 'controller_device' from source: play vars 11792 1727096133.36157: variable 'port1_profile' from source: play vars 11792 1727096133.36420: variable 'port1_profile' from source: play vars 11792 1727096133.36427: variable 'dhcp_interface1' from source: play vars 11792 1727096133.36690: variable 'dhcp_interface1' from source: play vars 11792 1727096133.36695: variable 'controller_profile' from source: play vars 11792 1727096133.36751: variable 'controller_profile' from source: play vars 11792 1727096133.36782: variable 'port2_profile' from source: play vars 11792 1727096133.36814: variable 'port2_profile' from source: play vars 11792 1727096133.36821: variable 'dhcp_interface2' from source: play vars 11792 1727096133.37125: variable 'dhcp_interface2' from source: play vars 11792 1727096133.37128: variable 'controller_profile' from source: play vars 11792 1727096133.37147: variable 'controller_profile' from source: play vars 11792 1727096133.37422: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096133.37485: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096133.37491: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096133.37550: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096133.37972: variable 
'__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096133.39549: variable 'network_connections' from source: include params 11792 1727096133.39552: variable 'controller_profile' from source: play vars 11792 1727096133.39621: variable 'controller_profile' from source: play vars 11792 1727096133.39629: variable 'controller_device' from source: play vars 11792 1727096133.40091: variable 'controller_device' from source: play vars 11792 1727096133.40102: variable 'port1_profile' from source: play vars 11792 1727096133.40161: variable 'port1_profile' from source: play vars 11792 1727096133.40170: variable 'dhcp_interface1' from source: play vars 11792 1727096133.40228: variable 'dhcp_interface1' from source: play vars 11792 1727096133.40235: variable 'controller_profile' from source: play vars 11792 1727096133.40973: variable 'controller_profile' from source: play vars 11792 1727096133.40977: variable 'port2_profile' from source: play vars 11792 1727096133.40979: variable 'port2_profile' from source: play vars 11792 1727096133.40982: variable 'dhcp_interface2' from source: play vars 11792 1727096133.40983: variable 'dhcp_interface2' from source: play vars 11792 1727096133.40985: variable 'controller_profile' from source: play vars 11792 1727096133.40987: variable 'controller_profile' from source: play vars 11792 1727096133.41179: variable 'ansible_distribution' from source: facts 11792 1727096133.41182: variable '__network_rh_distros' from source: role '' defaults 11792 1727096133.41189: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.41216: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096133.41501: variable 'ansible_distribution' from source: facts 11792 1727096133.41504: variable '__network_rh_distros' from source: role '' defaults 11792 1727096133.41510: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.41522: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096133.41901: variable 'ansible_distribution' from source: facts 11792 1727096133.41904: variable '__network_rh_distros' from source: role '' defaults 11792 1727096133.41907: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.41909: variable 'network_provider' from source: set_fact 11792 1727096133.42015: variable 'ansible_facts' from source: unknown 11792 1727096133.43469: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11792 1727096133.43473: when evaluation is False, skipping this task 11792 1727096133.43476: _execute() done 11792 1727096133.43478: dumping result to json 11792 1727096133.43480: done dumping result, returning 11792 1727096133.43482: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-d9c7-3fc0-00000000027f] 11792 1727096133.43825: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027f 11792 1727096133.43892: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000027f 11792 1727096133.43896: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11792 1727096133.43974: no more pending results, returning what we have 11792 1727096133.43977: results queue empty 11792 1727096133.43978: checking for 
any_errors_fatal 11792 1727096133.43983: done checking for any_errors_fatal 11792 1727096133.43984: checking for max_fail_percentage 11792 1727096133.43986: done checking for max_fail_percentage 11792 1727096133.43987: checking to see if all hosts have failed and the running result is not ok 11792 1727096133.43987: done checking to see if all hosts have failed 11792 1727096133.43988: getting the remaining hosts for this loop 11792 1727096133.43990: done getting the remaining hosts for this loop 11792 1727096133.43994: getting the next task for host managed_node2 11792 1727096133.44001: done getting next task for host managed_node2 11792 1727096133.44004: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096133.44010: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096133.44024: getting variables 11792 1727096133.44025: in VariableManager get_vars() 11792 1727096133.44059: Calling all_inventory to load vars for managed_node2 11792 1727096133.44062: Calling groups_inventory to load vars for managed_node2 11792 1727096133.44064: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096133.44075: Calling all_plugins_play to load vars for managed_node2 11792 1727096133.44077: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096133.44080: Calling groups_plugins_play to load vars for managed_node2 11792 1727096133.55490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096133.57543: done with get_vars() 11792 1727096133.57577: done getting variables 11792 1727096133.57626: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:33 -0400 (0:00:00.402) 0:00:15.855 ****** 11792 1727096133.57662: entering _queue_task() for managed_node2/package 11792 1727096133.58034: worker is 1 (out of 1 available) 11792 1727096133.58046: exiting _queue_task() for managed_node2/package 11792 1727096133.58059: done queuing things up, now waiting for results queue to drain 11792 1727096133.58061: waiting for pending results... 
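For reference, the "Install packages" task skipped above (roles/network/tasks/main.yml:73) uses the package action with the condition quoted in its skip result. A minimal sketch of such a task follows; the task name, file path, when-expression, and the network_packages variable come from the log, while the module arguments are assumptions about how the role is typically written, not its verbatim source:

  - name: Install packages
    package:
      name: "{{ network_packages }}"   # assumed argument; the log only confirms the package action and the network_packages variable
      state: present                   # assumed
    when:
      - not network_packages is subset(ansible_facts.packages.keys())   # condition quoted verbatim in the skip result above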
11792 1727096133.58410: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096133.58538: in run() - task 0afff68d-5257-d9c7-3fc0-000000000280 11792 1727096133.58590: variable 'ansible_search_path' from source: unknown 11792 1727096133.58594: variable 'ansible_search_path' from source: unknown 11792 1727096133.58633: calling self._execute() 11792 1727096133.58764: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096133.58834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096133.58838: variable 'omit' from source: magic vars 11792 1727096133.59226: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.59243: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096133.59367: variable 'network_state' from source: role '' defaults 11792 1727096133.59392: Evaluated conditional (network_state != {}): False 11792 1727096133.59400: when evaluation is False, skipping this task 11792 1727096133.59408: _execute() done 11792 1727096133.59416: dumping result to json 11792 1727096133.59424: done dumping result, returning 11792 1727096133.59459: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-000000000280] 11792 1727096133.59484: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000280 11792 1727096133.59744: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000280 11792 1727096133.59747: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096133.59800: no more pending results, returning what we have 11792 1727096133.59804: results queue empty 11792 1727096133.59805: checking for any_errors_fatal 11792 1727096133.59814: done checking for any_errors_fatal 11792 1727096133.59815: checking for max_fail_percentage 11792 1727096133.59817: done checking for max_fail_percentage 11792 1727096133.59818: checking to see if all hosts have failed and the running result is not ok 11792 1727096133.59819: done checking to see if all hosts have failed 11792 1727096133.59820: getting the remaining hosts for this loop 11792 1727096133.59821: done getting the remaining hosts for this loop 11792 1727096133.59825: getting the next task for host managed_node2 11792 1727096133.59835: done getting next task for host managed_node2 11792 1727096133.59840: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096133.59848: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096133.59871: getting variables 11792 1727096133.59874: in VariableManager get_vars() 11792 1727096133.59912: Calling all_inventory to load vars for managed_node2 11792 1727096133.59915: Calling groups_inventory to load vars for managed_node2 11792 1727096133.59917: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096133.59929: Calling all_plugins_play to load vars for managed_node2 11792 1727096133.59931: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096133.59933: Calling groups_plugins_play to load vars for managed_node2 11792 1727096133.62752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096133.65425: done with get_vars() 11792 1727096133.65455: done getting variables 11792 1727096133.65521: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:33 -0400 (0:00:00.078) 0:00:15.934 ****** 11792 1727096133.65557: entering _queue_task() for managed_node2/package 11792 1727096133.66090: worker is 1 (out of 1 available) 11792 1727096133.66100: exiting _queue_task() for managed_node2/package 11792 1727096133.66110: done queuing things up, now waiting for results queue to drain 11792 1727096133.66112: waiting for pending results... 
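The two nmstate-related install tasks (main.yml:85, skipped above, and main.yml:96, queued above with its result a few entries below) are both gated on network_state being non-empty. A hedged sketch under that assumption; the package action and the when-expression are confirmed by the log, while the package lists are illustrative only:

  - name: Install NetworkManager and nmstate when using network_state variable
    package:
      name:
        - NetworkManager   # assumed package list, for illustration
        - nmstate          # assumed
      state: present       # assumed
    when: network_state != {}   # condition quoted verbatim in the skip result

  - name: Install python3-libnmstate when using network_state variable
    package:
      name: python3-libnmstate   # assumed; inferred from the task name only
      state: present             # assumed
    when: network_state != {}    # same condition, quoted in the log below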
11792 1727096133.66352: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096133.66425: in run() - task 0afff68d-5257-d9c7-3fc0-000000000281 11792 1727096133.66454: variable 'ansible_search_path' from source: unknown 11792 1727096133.66464: variable 'ansible_search_path' from source: unknown 11792 1727096133.66511: calling self._execute() 11792 1727096133.66618: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096133.66632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096133.66647: variable 'omit' from source: magic vars 11792 1727096133.67396: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.67415: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096133.67545: variable 'network_state' from source: role '' defaults 11792 1727096133.67648: Evaluated conditional (network_state != {}): False 11792 1727096133.67651: when evaluation is False, skipping this task 11792 1727096133.67653: _execute() done 11792 1727096133.67656: dumping result to json 11792 1727096133.67658: done dumping result, returning 11792 1727096133.67660: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-000000000281] 11792 1727096133.67662: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000281 11792 1727096133.67737: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000281 11792 1727096133.67740: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096133.67798: no more pending results, returning what we have 11792 1727096133.67802: results queue empty 11792 1727096133.67803: checking for any_errors_fatal 11792 1727096133.67812: done checking for any_errors_fatal 11792 1727096133.67813: checking for max_fail_percentage 11792 1727096133.67815: done checking for max_fail_percentage 11792 1727096133.67816: checking to see if all hosts have failed and the running result is not ok 11792 1727096133.67816: done checking to see if all hosts have failed 11792 1727096133.67817: getting the remaining hosts for this loop 11792 1727096133.67819: done getting the remaining hosts for this loop 11792 1727096133.67823: getting the next task for host managed_node2 11792 1727096133.67832: done getting next task for host managed_node2 11792 1727096133.67835: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096133.67843: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096133.67860: getting variables 11792 1727096133.67862: in VariableManager get_vars() 11792 1727096133.68002: Calling all_inventory to load vars for managed_node2 11792 1727096133.68005: Calling groups_inventory to load vars for managed_node2 11792 1727096133.68008: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096133.68021: Calling all_plugins_play to load vars for managed_node2 11792 1727096133.68024: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096133.68028: Calling groups_plugins_play to load vars for managed_node2 11792 1727096133.69880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096133.72000: done with get_vars() 11792 1727096133.72028: done getting variables 11792 1727096133.72338: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:33 -0400 (0:00:00.068) 0:00:16.002 ****** 11792 1727096133.72374: entering _queue_task() for managed_node2/service 11792 1727096133.72375: Creating lock for service 11792 1727096133.73131: worker is 1 (out of 1 available) 11792 1727096133.73144: exiting _queue_task() for managed_node2/service 11792 1727096133.73157: done queuing things up, now waiting for results queue to drain 11792 1727096133.73159: waiting for pending results... 
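Both the earlier "Ask user's consent to restart NetworkManager..." task and the "Restart NetworkManager due to wireless or team interfaces" task queued just above (main.yml:109) hinge on the same conditional, whose evaluation appears a few entries further down. A rough sketch of the restart task, assuming a typical service invocation; only the service action plugin and the when-expression are confirmed by the log:

  - name: Restart NetworkManager due to wireless or team interfaces
    service:
      name: NetworkManager   # assumed service name, for illustration
      state: restarted       # assumed
    when: __network_wireless_connections_defined or __network_team_connections_defined   # quoted verbatim in the skip result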
11792 1727096133.73511: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096133.73710: in run() - task 0afff68d-5257-d9c7-3fc0-000000000282 11792 1727096133.73715: variable 'ansible_search_path' from source: unknown 11792 1727096133.73717: variable 'ansible_search_path' from source: unknown 11792 1727096133.73720: calling self._execute() 11792 1727096133.73793: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096133.73835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096133.73928: variable 'omit' from source: magic vars 11792 1727096133.74316: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.74342: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096133.74510: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096133.74728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096133.78732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096133.78834: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096133.78882: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096133.78929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096133.78964: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096133.79057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.79093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.79141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.79176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.79195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.79372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.79376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.79379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11792 1727096133.79382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.79384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.79408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.79436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.79470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.79516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.79533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.79719: variable 'network_connections' from source: include params 11792 1727096133.79824: variable 'controller_profile' from source: play vars 11792 1727096133.79828: variable 'controller_profile' from source: play vars 11792 1727096133.79841: variable 'controller_device' from source: play vars 11792 1727096133.79910: variable 'controller_device' from source: play vars 11792 1727096133.79936: variable 'port1_profile' from source: play vars 11792 1727096133.80000: variable 'port1_profile' from source: play vars 11792 1727096133.80013: variable 'dhcp_interface1' from source: play vars 11792 1727096133.80083: variable 'dhcp_interface1' from source: play vars 11792 1727096133.80095: variable 'controller_profile' from source: play vars 11792 1727096133.80161: variable 'controller_profile' from source: play vars 11792 1727096133.80175: variable 'port2_profile' from source: play vars 11792 1727096133.80230: variable 'port2_profile' from source: play vars 11792 1727096133.80240: variable 'dhcp_interface2' from source: play vars 11792 1727096133.80301: variable 'dhcp_interface2' from source: play vars 11792 1727096133.80312: variable 'controller_profile' from source: play vars 11792 1727096133.80374: variable 'controller_profile' from source: play vars 11792 1727096133.80449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096133.80812: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096133.80815: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096133.80844: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096133.80883: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096133.80940: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096133.80971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096133.81002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.81359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096133.81374: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096133.82007: variable 'network_connections' from source: include params 11792 1727096133.82092: variable 'controller_profile' from source: play vars 11792 1727096133.82162: variable 'controller_profile' from source: play vars 11792 1727096133.82255: variable 'controller_device' from source: play vars 11792 1727096133.82433: variable 'controller_device' from source: play vars 11792 1727096133.82453: variable 'port1_profile' from source: play vars 11792 1727096133.82611: variable 'port1_profile' from source: play vars 11792 1727096133.82636: variable 'dhcp_interface1' from source: play vars 11792 1727096133.82712: variable 'dhcp_interface1' from source: play vars 11792 1727096133.82723: variable 'controller_profile' from source: play vars 11792 1727096133.82793: variable 'controller_profile' from source: play vars 11792 1727096133.82805: variable 'port2_profile' from source: play vars 11792 1727096133.82874: variable 'port2_profile' from source: play vars 11792 1727096133.82890: variable 'dhcp_interface2' from source: play vars 11792 1727096133.82958: variable 'dhcp_interface2' from source: play vars 11792 1727096133.82973: variable 'controller_profile' from source: play vars 11792 1727096133.83059: variable 'controller_profile' from source: play vars 11792 1727096133.83086: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096133.83093: when evaluation is False, skipping this task 11792 1727096133.83100: _execute() done 11792 1727096133.83106: dumping result to json 11792 1727096133.83169: done dumping result, returning 11792 1727096133.83173: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000282] 11792 1727096133.83175: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000282 11792 1727096133.83255: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000282 11792 1727096133.83259: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096133.83318: no more pending results, returning what we have 11792 1727096133.83322: results queue empty 11792 1727096133.83323: checking for any_errors_fatal 11792 1727096133.83332: done checking for any_errors_fatal 11792 1727096133.83333: checking for max_fail_percentage 11792 1727096133.83334: done checking for max_fail_percentage 11792 
1727096133.83335: checking to see if all hosts have failed and the running result is not ok 11792 1727096133.83336: done checking to see if all hosts have failed 11792 1727096133.83336: getting the remaining hosts for this loop 11792 1727096133.83338: done getting the remaining hosts for this loop 11792 1727096133.83342: getting the next task for host managed_node2 11792 1727096133.83352: done getting next task for host managed_node2 11792 1727096133.83356: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096133.83362: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096133.83379: getting variables 11792 1727096133.83380: in VariableManager get_vars() 11792 1727096133.83418: Calling all_inventory to load vars for managed_node2 11792 1727096133.83421: Calling groups_inventory to load vars for managed_node2 11792 1727096133.83423: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096133.83435: Calling all_plugins_play to load vars for managed_node2 11792 1727096133.83438: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096133.83441: Calling groups_plugins_play to load vars for managed_node2 11792 1727096133.85092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096133.86813: done with get_vars() 11792 1727096133.86836: done getting variables 11792 1727096133.86901: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:33 -0400 (0:00:00.145) 0:00:16.148 ****** 11792 1727096133.86933: entering _queue_task() for managed_node2/service 11792 1727096133.87278: worker is 1 (out of 1 available) 11792 1727096133.87289: exiting _queue_task() for managed_node2/service 11792 1727096133.87301: done queuing things up, now waiting for results queue to drain 11792 1727096133.87303: waiting for pending results... 
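The "Enable and start NetworkManager" task queued above (main.yml:122) does run: the entries below show its conditional evaluating True and the network_service_name variable being resolved. A hedged sketch, with the service arguments assumed rather than taken from the role source:

  - name: Enable and start NetworkManager
    service:
      name: "{{ network_service_name }}"   # variable resolution is visible in the log; the argument mapping is assumed
      state: started                       # assumed
      enabled: true                        # assumed
    when: network_provider == "nm" or network_state != {}   # conditional quoted in the log below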
11792 1727096133.87690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096133.87735: in run() - task 0afff68d-5257-d9c7-3fc0-000000000283 11792 1727096133.87759: variable 'ansible_search_path' from source: unknown 11792 1727096133.87767: variable 'ansible_search_path' from source: unknown 11792 1727096133.87813: calling self._execute() 11792 1727096133.87915: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096133.87927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096133.87939: variable 'omit' from source: magic vars 11792 1727096133.88319: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.88341: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096133.88509: variable 'network_provider' from source: set_fact 11792 1727096133.88518: variable 'network_state' from source: role '' defaults 11792 1727096133.88532: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11792 1727096133.88547: variable 'omit' from source: magic vars 11792 1727096133.88617: variable 'omit' from source: magic vars 11792 1727096133.88649: variable 'network_service_name' from source: role '' defaults 11792 1727096133.88724: variable 'network_service_name' from source: role '' defaults 11792 1727096133.88835: variable '__network_provider_setup' from source: role '' defaults 11792 1727096133.88890: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096133.88916: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096133.88930: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096133.89001: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096133.89235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096133.91477: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096133.91620: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096133.91624: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096133.91636: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096133.91663: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096133.91748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.91781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.91817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.91872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11792 1727096133.91876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.92096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.92100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.92102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.92105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.92107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.92245: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096133.92433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.92465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.92498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.92538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.92558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.92658: variable 'ansible_python' from source: facts 11792 1727096133.92681: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096133.92773: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096133.92862: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096133.92999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.93033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.93066: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.93110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.93134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.93188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096133.93372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096133.93375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.93378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096133.93380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096133.93463: variable 'network_connections' from source: include params 11792 1727096133.93478: variable 'controller_profile' from source: play vars 11792 1727096133.93561: variable 'controller_profile' from source: play vars 11792 1727096133.93582: variable 'controller_device' from source: play vars 11792 1727096133.93661: variable 'controller_device' from source: play vars 11792 1727096133.93686: variable 'port1_profile' from source: play vars 11792 1727096133.93766: variable 'port1_profile' from source: play vars 11792 1727096133.93785: variable 'dhcp_interface1' from source: play vars 11792 1727096133.93866: variable 'dhcp_interface1' from source: play vars 11792 1727096133.93884: variable 'controller_profile' from source: play vars 11792 1727096133.93963: variable 'controller_profile' from source: play vars 11792 1727096133.93981: variable 'port2_profile' from source: play vars 11792 1727096133.94061: variable 'port2_profile' from source: play vars 11792 1727096133.94079: variable 'dhcp_interface2' from source: play vars 11792 1727096133.94158: variable 'dhcp_interface2' from source: play vars 11792 1727096133.94176: variable 'controller_profile' from source: play vars 11792 1727096133.94365: variable 'controller_profile' from source: play vars 11792 1727096133.94371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096133.94586: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096133.94639: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096133.94691: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 
1727096133.94737: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096133.94810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096133.94843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096133.94886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096133.94927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096133.94985: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096133.95289: variable 'network_connections' from source: include params 11792 1727096133.95354: variable 'controller_profile' from source: play vars 11792 1727096133.95385: variable 'controller_profile' from source: play vars 11792 1727096133.95401: variable 'controller_device' from source: play vars 11792 1727096133.95483: variable 'controller_device' from source: play vars 11792 1727096133.95500: variable 'port1_profile' from source: play vars 11792 1727096133.95570: variable 'port1_profile' from source: play vars 11792 1727096133.95584: variable 'dhcp_interface1' from source: play vars 11792 1727096133.95648: variable 'dhcp_interface1' from source: play vars 11792 1727096133.95664: variable 'controller_profile' from source: play vars 11792 1727096133.95743: variable 'controller_profile' from source: play vars 11792 1727096133.95784: variable 'port2_profile' from source: play vars 11792 1727096133.95847: variable 'port2_profile' from source: play vars 11792 1727096133.95892: variable 'dhcp_interface2' from source: play vars 11792 1727096133.95954: variable 'dhcp_interface2' from source: play vars 11792 1727096133.95973: variable 'controller_profile' from source: play vars 11792 1727096133.96109: variable 'controller_profile' from source: play vars 11792 1727096133.96132: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096133.96255: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096133.96593: variable 'network_connections' from source: include params 11792 1727096133.96604: variable 'controller_profile' from source: play vars 11792 1727096133.96685: variable 'controller_profile' from source: play vars 11792 1727096133.96723: variable 'controller_device' from source: play vars 11792 1727096133.96807: variable 'controller_device' from source: play vars 11792 1727096133.96870: variable 'port1_profile' from source: play vars 11792 1727096133.96914: variable 'port1_profile' from source: play vars 11792 1727096133.96927: variable 'dhcp_interface1' from source: play vars 11792 1727096133.97019: variable 'dhcp_interface1' from source: play vars 11792 1727096133.97032: variable 'controller_profile' from source: play vars 11792 1727096133.97136: variable 'controller_profile' from source: play vars 11792 1727096133.97162: variable 'port2_profile' from source: play vars 11792 1727096133.97298: variable 'port2_profile' from source: play vars 11792 
1727096133.97301: variable 'dhcp_interface2' from source: play vars 11792 1727096133.97370: variable 'dhcp_interface2' from source: play vars 11792 1727096133.97384: variable 'controller_profile' from source: play vars 11792 1727096133.97462: variable 'controller_profile' from source: play vars 11792 1727096133.97497: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096133.97610: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096133.97937: variable 'network_connections' from source: include params 11792 1727096133.97973: variable 'controller_profile' from source: play vars 11792 1727096133.98025: variable 'controller_profile' from source: play vars 11792 1727096133.98038: variable 'controller_device' from source: play vars 11792 1727096133.98116: variable 'controller_device' from source: play vars 11792 1727096133.98165: variable 'port1_profile' from source: play vars 11792 1727096133.98210: variable 'port1_profile' from source: play vars 11792 1727096133.98222: variable 'dhcp_interface1' from source: play vars 11792 1727096133.98301: variable 'dhcp_interface1' from source: play vars 11792 1727096133.98314: variable 'controller_profile' from source: play vars 11792 1727096133.98421: variable 'controller_profile' from source: play vars 11792 1727096133.98424: variable 'port2_profile' from source: play vars 11792 1727096133.98507: variable 'port2_profile' from source: play vars 11792 1727096133.98524: variable 'dhcp_interface2' from source: play vars 11792 1727096133.98580: variable 'dhcp_interface2' from source: play vars 11792 1727096133.98585: variable 'controller_profile' from source: play vars 11792 1727096133.98637: variable 'controller_profile' from source: play vars 11792 1727096133.98690: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096133.98748: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096133.98754: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096133.98805: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096133.98946: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096133.99249: variable 'network_connections' from source: include params 11792 1727096133.99254: variable 'controller_profile' from source: play vars 11792 1727096133.99300: variable 'controller_profile' from source: play vars 11792 1727096133.99305: variable 'controller_device' from source: play vars 11792 1727096133.99345: variable 'controller_device' from source: play vars 11792 1727096133.99357: variable 'port1_profile' from source: play vars 11792 1727096133.99402: variable 'port1_profile' from source: play vars 11792 1727096133.99408: variable 'dhcp_interface1' from source: play vars 11792 1727096133.99450: variable 'dhcp_interface1' from source: play vars 11792 1727096133.99457: variable 'controller_profile' from source: play vars 11792 1727096133.99503: variable 'controller_profile' from source: play vars 11792 1727096133.99509: variable 'port2_profile' from source: play vars 11792 1727096133.99549: variable 'port2_profile' from source: play vars 11792 1727096133.99557: variable 'dhcp_interface2' from source: play vars 11792 1727096133.99601: variable 'dhcp_interface2' from source: play vars 11792 1727096133.99607: variable 'controller_profile' from source: play vars 11792 1727096133.99647: variable 
'controller_profile' from source: play vars 11792 1727096133.99656: variable 'ansible_distribution' from source: facts 11792 1727096133.99659: variable '__network_rh_distros' from source: role '' defaults 11792 1727096133.99665: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.99688: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096133.99808: variable 'ansible_distribution' from source: facts 11792 1727096133.99811: variable '__network_rh_distros' from source: role '' defaults 11792 1727096133.99814: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.99821: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096133.99933: variable 'ansible_distribution' from source: facts 11792 1727096133.99936: variable '__network_rh_distros' from source: role '' defaults 11792 1727096133.99941: variable 'ansible_distribution_major_version' from source: facts 11792 1727096133.99972: variable 'network_provider' from source: set_fact 11792 1727096133.99991: variable 'omit' from source: magic vars 11792 1727096134.00013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096134.00038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096134.00055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096134.00070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096134.00078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096134.00100: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096134.00103: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096134.00106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096134.00191: Set connection var ansible_timeout to 10 11792 1727096134.00194: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096134.00204: Set connection var ansible_shell_executable to /bin/sh 11792 1727096134.00207: Set connection var ansible_pipelining to False 11792 1727096134.00210: Set connection var ansible_shell_type to sh 11792 1727096134.00213: Set connection var ansible_connection to ssh 11792 1727096134.00235: variable 'ansible_shell_executable' from source: unknown 11792 1727096134.00241: variable 'ansible_connection' from source: unknown 11792 1727096134.00244: variable 'ansible_module_compression' from source: unknown 11792 1727096134.00247: variable 'ansible_shell_type' from source: unknown 11792 1727096134.00274: variable 'ansible_shell_executable' from source: unknown 11792 1727096134.00277: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096134.00279: variable 'ansible_pipelining' from source: unknown 11792 1727096134.00281: variable 'ansible_timeout' from source: unknown 11792 1727096134.00283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096134.00427: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096134.00431: variable 'omit' from source: magic vars 11792 1727096134.00434: starting attempt loop 11792 1727096134.00436: running the handler 11792 1727096134.00587: variable 'ansible_facts' from source: unknown 11792 1727096134.01915: _low_level_execute_command(): starting 11792 1727096134.01939: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096134.02427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096134.02432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096134.02436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.02474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096134.02487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096134.02536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096134.04515: stdout chunk (state=3): >>>/root <<< 11792 1727096134.04520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096134.04523: stdout chunk (state=3): >>><<< 11792 1727096134.04525: stderr chunk (state=3): >>><<< 11792 1727096134.04527: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 
1727096134.04539: _low_level_execute_command(): starting 11792 1727096134.04542: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149 `" && echo ansible-tmp-1727096134.044214-12521-59992016747149="` echo /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149 `" ) && sleep 0' 11792 1727096134.05318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.05323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096134.05387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096134.05437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096134.07561: stdout chunk (state=3): >>>ansible-tmp-1727096134.044214-12521-59992016747149=/root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149 <<< 11792 1727096134.07705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096134.07747: stderr chunk (state=3): >>><<< 11792 1727096134.07753: stdout chunk (state=3): >>><<< 11792 1727096134.07773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096134.044214-12521-59992016747149=/root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 
1727096134.07808: variable 'ansible_module_compression' from source: unknown 11792 1727096134.07870: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11792 1727096134.07874: ANSIBALLZ: Acquiring lock 11792 1727096134.07877: ANSIBALLZ: Lock acquired: 139635227775856 11792 1727096134.07879: ANSIBALLZ: Creating module 11792 1727096134.31324: ANSIBALLZ: Writing module into payload 11792 1727096134.31509: ANSIBALLZ: Writing module 11792 1727096134.31542: ANSIBALLZ: Renaming module 11792 1727096134.31555: ANSIBALLZ: Done creating module 11792 1727096134.31604: variable 'ansible_facts' from source: unknown 11792 1727096134.31821: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/AnsiballZ_systemd.py 11792 1727096134.32044: Sending initial data 11792 1727096134.32151: Sent initial data (154 bytes) 11792 1727096134.32527: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096134.32531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.32542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.32605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096134.32610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096134.32617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096134.32647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096134.34584: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096134.34593: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096134.34726: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpum9f3m9m /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/AnsiballZ_systemd.py <<< 11792 1727096134.34730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpum9f3m9m" to remote "/root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/AnsiballZ_systemd.py" <<< 11792 1727096134.36144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096134.36209: stderr chunk (state=3): >>><<< 11792 1727096134.36223: stdout chunk (state=3): >>><<< 11792 1727096134.36264: done transferring module to remote 11792 1727096134.36281: _low_level_execute_command(): starting 11792 1727096134.36291: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/ /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/AnsiballZ_systemd.py && sleep 0' 11792 1727096134.36945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096134.36963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096134.36986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096134.37090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.37119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096134.37138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096134.37164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096134.37234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096134.39158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096134.39189: stdout chunk (state=3): >>><<< 11792 1727096134.39205: stderr chunk (state=3): >>><<< 11792 1727096134.39225: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096134.39233: _low_level_execute_command(): starting 11792 1727096134.39243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/AnsiballZ_systemd.py && sleep 0' 11792 1727096134.39865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096134.39886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096134.39901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096134.39927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096134.39945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096134.39963: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096134.40038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.40090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096134.40112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096134.40145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096134.40225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096134.69952: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", 
"ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "3862528", "MemoryPeak": "4386816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3291172864", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "393785000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 11792 1727096134.69977: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", 
"SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "syste<<< 11792 1727096134.69991: stdout chunk (state=3): >>>md-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11792 1727096134.71947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096134.71979: stderr chunk (state=3): >>><<< 11792 1727096134.71982: stdout chunk (state=3): >>><<< 11792 1727096134.72000: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "3862528", "MemoryPeak": "4386816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3291172864", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "393785000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096134.72121: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096134.72140: _low_level_execute_command(): starting 11792 1727096134.72144: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096134.044214-12521-59992016747149/ > /dev/null 2>&1 && sleep 0' 11792 1727096134.72610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096134.72614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096134.72617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.72619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096134.72621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096134.72678: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096134.72681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096134.72683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096134.72723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096134.74633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096134.74637: stdout chunk (state=3): >>><<< 11792 1727096134.74640: stderr chunk (state=3): >>><<< 11792 1727096134.74658: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096134.74664: handler run complete 11792 1727096134.74706: attempt loop complete, returning result 11792 1727096134.74709: _execute() done 11792 1727096134.74711: dumping result to json 11792 1727096134.74726: done dumping result, returning 11792 1727096134.74734: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-d9c7-3fc0-000000000283] 11792 1727096134.74736: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000283 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096134.75054: no more pending results, returning what we have 11792 1727096134.75058: results queue empty 11792 1727096134.75058: checking for any_errors_fatal 11792 1727096134.75066: done checking for any_errors_fatal 11792 1727096134.75066: checking for max_fail_percentage 11792 1727096134.75070: done checking for max_fail_percentage 11792 1727096134.75070: checking to see if all hosts have failed and the running result is not ok 11792 1727096134.75071: done checking to see if all hosts have failed 11792 1727096134.75072: getting the remaining hosts for this loop 11792 1727096134.75074: done getting the remaining hosts for this loop 11792 1727096134.75078: getting the next task for host managed_node2 11792 1727096134.75084: done getting next task for host managed_node2 11792 1727096134.75087: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 1727096134.75100: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096134.75109: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000283 11792 1727096134.75112: WORKER PROCESS EXITING 11792 1727096134.75118: getting variables 11792 1727096134.75120: in VariableManager get_vars() 11792 1727096134.75149: Calling all_inventory to load vars for managed_node2 11792 1727096134.75154: Calling groups_inventory to load vars for managed_node2 11792 1727096134.75156: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096134.75165: Calling all_plugins_play to load vars for managed_node2 11792 1727096134.75171: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096134.75174: Calling groups_plugins_play to load vars for managed_node2 11792 1727096134.75957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096134.77122: done with get_vars() 11792 1727096134.77147: done getting variables 11792 1727096134.77208: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:34 -0400 (0:00:00.903) 0:00:17.051 ****** 11792 1727096134.77247: entering _queue_task() for managed_node2/service 11792 1727096134.77599: worker is 1 (out of 1 available) 11792 1727096134.77611: exiting _queue_task() for managed_node2/service 11792 1727096134.77627: done queuing things up, now waiting for results queue to drain 11792 1727096134.77628: waiting for pending results... 
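The worker run that follows evaluates the guards on the wpa_supplicant task and skips it because __network_wpa_supplicant_required is false. A minimal sketch of that when-guard pattern, with hypothetical task wording (the conditionals mirror the evaluations in the log; this is not the role's actual tasks/main.yml entry):

# Hedged illustration of a conditionally skipped service task.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool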
11792 1727096134.77835: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 1727096134.77919: in run() - task 0afff68d-5257-d9c7-3fc0-000000000284 11792 1727096134.77932: variable 'ansible_search_path' from source: unknown 11792 1727096134.77935: variable 'ansible_search_path' from source: unknown 11792 1727096134.77968: calling self._execute() 11792 1727096134.78039: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096134.78042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096134.78056: variable 'omit' from source: magic vars 11792 1727096134.78329: variable 'ansible_distribution_major_version' from source: facts 11792 1727096134.78338: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096134.78421: variable 'network_provider' from source: set_fact 11792 1727096134.78424: Evaluated conditional (network_provider == "nm"): True 11792 1727096134.78494: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096134.78555: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096134.78675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096134.80773: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096134.80777: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096134.80780: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096134.80798: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096134.80828: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096134.80907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096134.80942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096134.80979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096134.81027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096134.81046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096134.81098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096134.81124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11792 1727096134.81152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096134.81197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096134.81215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096134.81259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096134.81290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096134.81316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096134.81356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096134.81376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096134.81521: variable 'network_connections' from source: include params 11792 1727096134.81540: variable 'controller_profile' from source: play vars 11792 1727096134.81611: variable 'controller_profile' from source: play vars 11792 1727096134.81772: variable 'controller_device' from source: play vars 11792 1727096134.81776: variable 'controller_device' from source: play vars 11792 1727096134.81778: variable 'port1_profile' from source: play vars 11792 1727096134.81780: variable 'port1_profile' from source: play vars 11792 1727096134.81782: variable 'dhcp_interface1' from source: play vars 11792 1727096134.81836: variable 'dhcp_interface1' from source: play vars 11792 1727096134.81848: variable 'controller_profile' from source: play vars 11792 1727096134.81910: variable 'controller_profile' from source: play vars 11792 1727096134.81923: variable 'port2_profile' from source: play vars 11792 1727096134.81985: variable 'port2_profile' from source: play vars 11792 1727096134.81997: variable 'dhcp_interface2' from source: play vars 11792 1727096134.82057: variable 'dhcp_interface2' from source: play vars 11792 1727096134.82071: variable 'controller_profile' from source: play vars 11792 1727096134.82146: variable 'controller_profile' from source: play vars 11792 1727096134.82218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096134.82379: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096134.82419: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096134.82452: Loading TestModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096134.82485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096134.82529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096134.82554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096134.82585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096134.82623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096134.82677: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096134.82907: variable 'network_connections' from source: include params 11792 1727096134.82917: variable 'controller_profile' from source: play vars 11792 1727096134.82979: variable 'controller_profile' from source: play vars 11792 1727096134.82991: variable 'controller_device' from source: play vars 11792 1727096134.83052: variable 'controller_device' from source: play vars 11792 1727096134.83071: variable 'port1_profile' from source: play vars 11792 1727096134.83172: variable 'port1_profile' from source: play vars 11792 1727096134.83175: variable 'dhcp_interface1' from source: play vars 11792 1727096134.83203: variable 'dhcp_interface1' from source: play vars 11792 1727096134.83214: variable 'controller_profile' from source: play vars 11792 1727096134.83275: variable 'controller_profile' from source: play vars 11792 1727096134.83288: variable 'port2_profile' from source: play vars 11792 1727096134.83348: variable 'port2_profile' from source: play vars 11792 1727096134.83359: variable 'dhcp_interface2' from source: play vars 11792 1727096134.83421: variable 'dhcp_interface2' from source: play vars 11792 1727096134.83572: variable 'controller_profile' from source: play vars 11792 1727096134.83575: variable 'controller_profile' from source: play vars 11792 1727096134.83577: Evaluated conditional (__network_wpa_supplicant_required): False 11792 1727096134.83579: when evaluation is False, skipping this task 11792 1727096134.83582: _execute() done 11792 1727096134.83584: dumping result to json 11792 1727096134.83586: done dumping result, returning 11792 1727096134.83588: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-d9c7-3fc0-000000000284] 11792 1727096134.83590: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000284 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11792 1727096134.83714: no more pending results, returning what we have 11792 1727096134.83717: results queue empty 11792 1727096134.83718: checking for any_errors_fatal 11792 1727096134.83740: done checking for any_errors_fatal 11792 1727096134.83741: checking for max_fail_percentage 11792 1727096134.83742: done checking for max_fail_percentage 11792 1727096134.83743: checking 
to see if all hosts have failed and the running result is not ok 11792 1727096134.83744: done checking to see if all hosts have failed 11792 1727096134.83744: getting the remaining hosts for this loop 11792 1727096134.83746: done getting the remaining hosts for this loop 11792 1727096134.83749: getting the next task for host managed_node2 11792 1727096134.83758: done getting next task for host managed_node2 11792 1727096134.83761: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096134.83772: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096134.83783: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000284 11792 1727096134.83785: WORKER PROCESS EXITING 11792 1727096134.83793: getting variables 11792 1727096134.83794: in VariableManager get_vars() 11792 1727096134.83829: Calling all_inventory to load vars for managed_node2 11792 1727096134.83832: Calling groups_inventory to load vars for managed_node2 11792 1727096134.83834: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096134.83844: Calling all_plugins_play to load vars for managed_node2 11792 1727096134.83847: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096134.83849: Calling groups_plugins_play to load vars for managed_node2 11792 1727096134.85407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096134.86878: done with get_vars() 11792 1727096134.86899: done getting variables 11792 1727096134.86951: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:34 -0400 (0:00:00.097) 0:00:17.149 ****** 11792 1727096134.86983: entering _queue_task() for managed_node2/service 11792 1727096134.87271: worker is 1 (out of 1 available) 11792 1727096134.87283: exiting _queue_task() for managed_node2/service 11792 1727096134.87295: done queuing things up, now waiting for results queue to drain 11792 1727096134.87296: waiting for 
pending results... 11792 1727096134.87571: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096134.87691: in run() - task 0afff68d-5257-d9c7-3fc0-000000000285 11792 1727096134.87714: variable 'ansible_search_path' from source: unknown 11792 1727096134.87782: variable 'ansible_search_path' from source: unknown 11792 1727096134.87786: calling self._execute() 11792 1727096134.87853: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096134.87866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096134.87883: variable 'omit' from source: magic vars 11792 1727096134.88248: variable 'ansible_distribution_major_version' from source: facts 11792 1727096134.88265: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096134.88385: variable 'network_provider' from source: set_fact 11792 1727096134.88396: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096134.88403: when evaluation is False, skipping this task 11792 1727096134.88409: _execute() done 11792 1727096134.88416: dumping result to json 11792 1727096134.88434: done dumping result, returning 11792 1727096134.88437: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-d9c7-3fc0-000000000285] 11792 1727096134.88443: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000285 11792 1727096134.88607: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000285 11792 1727096134.88610: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096134.88691: no more pending results, returning what we have 11792 1727096134.88694: results queue empty 11792 1727096134.88695: checking for any_errors_fatal 11792 1727096134.88707: done checking for any_errors_fatal 11792 1727096134.88708: checking for max_fail_percentage 11792 1727096134.88710: done checking for max_fail_percentage 11792 1727096134.88710: checking to see if all hosts have failed and the running result is not ok 11792 1727096134.88711: done checking to see if all hosts have failed 11792 1727096134.88712: getting the remaining hosts for this loop 11792 1727096134.88713: done getting the remaining hosts for this loop 11792 1727096134.88717: getting the next task for host managed_node2 11792 1727096134.88725: done getting next task for host managed_node2 11792 1727096134.88729: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096134.88735: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096134.88750: getting variables 11792 1727096134.88751: in VariableManager get_vars() 11792 1727096134.88789: Calling all_inventory to load vars for managed_node2 11792 1727096134.88792: Calling groups_inventory to load vars for managed_node2 11792 1727096134.88795: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096134.88807: Calling all_plugins_play to load vars for managed_node2 11792 1727096134.88810: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096134.88812: Calling groups_plugins_play to load vars for managed_node2 11792 1727096134.90197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096134.91820: done with get_vars() 11792 1727096134.91844: done getting variables 11792 1727096134.91908: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:34 -0400 (0:00:00.049) 0:00:17.198 ****** 11792 1727096134.91944: entering _queue_task() for managed_node2/copy 11792 1727096134.92254: worker is 1 (out of 1 available) 11792 1727096134.92264: exiting _queue_task() for managed_node2/copy 11792 1727096134.92478: done queuing things up, now waiting for results queue to drain 11792 1727096134.92481: waiting for pending results... 
11792 1727096134.92609: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096134.92688: in run() - task 0afff68d-5257-d9c7-3fc0-000000000286 11792 1727096134.92715: variable 'ansible_search_path' from source: unknown 11792 1727096134.92723: variable 'ansible_search_path' from source: unknown 11792 1727096134.92762: calling self._execute() 11792 1727096134.92861: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096134.92876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096134.92922: variable 'omit' from source: magic vars 11792 1727096134.93279: variable 'ansible_distribution_major_version' from source: facts 11792 1727096134.93296: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096134.93420: variable 'network_provider' from source: set_fact 11792 1727096134.93434: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096134.93444: when evaluation is False, skipping this task 11792 1727096134.93468: _execute() done 11792 1727096134.93472: dumping result to json 11792 1727096134.93474: done dumping result, returning 11792 1727096134.93477: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-d9c7-3fc0-000000000286] 11792 1727096134.93575: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000286 11792 1727096134.93647: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000286 11792 1727096134.93651: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11792 1727096134.93725: no more pending results, returning what we have 11792 1727096134.93729: results queue empty 11792 1727096134.93730: checking for any_errors_fatal 11792 1727096134.93737: done checking for any_errors_fatal 11792 1727096134.93737: checking for max_fail_percentage 11792 1727096134.93739: done checking for max_fail_percentage 11792 1727096134.93740: checking to see if all hosts have failed and the running result is not ok 11792 1727096134.93740: done checking to see if all hosts have failed 11792 1727096134.93741: getting the remaining hosts for this loop 11792 1727096134.93743: done getting the remaining hosts for this loop 11792 1727096134.93747: getting the next task for host managed_node2 11792 1727096134.93755: done getting next task for host managed_node2 11792 1727096134.93759: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096134.93778: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096134.93794: getting variables 11792 1727096134.93796: in VariableManager get_vars() 11792 1727096134.93835: Calling all_inventory to load vars for managed_node2 11792 1727096134.93838: Calling groups_inventory to load vars for managed_node2 11792 1727096134.93840: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096134.93853: Calling all_plugins_play to load vars for managed_node2 11792 1727096134.93856: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096134.93859: Calling groups_plugins_play to load vars for managed_node2 11792 1727096134.95297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096134.97489: done with get_vars() 11792 1727096134.97518: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:34 -0400 (0:00:00.056) 0:00:17.255 ****** 11792 1727096134.97613: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096134.97615: Creating lock for fedora.linux_system_roles.network_connections 11792 1727096134.97948: worker is 1 (out of 1 available) 11792 1727096134.97962: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096134.97977: done queuing things up, now waiting for results queue to drain 11792 1727096134.97979: waiting for pending results... 
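The task queued here hands a connections list to fedora.linux_system_roles.network_connections; the module result a few lines below echoes it back as module_args (bond0 in 802.3ad mode with ports bond0.0 on test1 and bond0.1 on test2). A hedged reconstruction of the corresponding network_connections variable, showing only a subset of the bond options visible in that result; the layout is an assumption, not the test playbook's literal source:

# Illustrative vars block assembled from the module_args in the result below.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: 802.3ad
      miimon: 110
      lacp_rate: slow
      xmit_hash_policy: encap2+3
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0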
11792 1727096134.98386: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096134.98392: in run() - task 0afff68d-5257-d9c7-3fc0-000000000287 11792 1727096134.98411: variable 'ansible_search_path' from source: unknown 11792 1727096134.98417: variable 'ansible_search_path' from source: unknown 11792 1727096134.98454: calling self._execute() 11792 1727096134.98544: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096134.98556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096134.98573: variable 'omit' from source: magic vars 11792 1727096134.98942: variable 'ansible_distribution_major_version' from source: facts 11792 1727096134.99244: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096134.99247: variable 'omit' from source: magic vars 11792 1727096134.99249: variable 'omit' from source: magic vars 11792 1727096134.99605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096135.02063: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096135.02138: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096135.02190: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096135.02229: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096135.02265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096135.02349: variable 'network_provider' from source: set_fact 11792 1727096135.02499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096135.02531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096135.02564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096135.02616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096135.02636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096135.02719: variable 'omit' from source: magic vars 11792 1727096135.02835: variable 'omit' from source: magic vars 11792 1727096135.02965: variable 'network_connections' from source: include params 11792 1727096135.02984: variable 'controller_profile' from source: play vars 11792 1727096135.03053: variable 'controller_profile' from source: play vars 11792 1727096135.03073: variable 'controller_device' from source: play vars 11792 1727096135.03135: variable 'controller_device' from source: play vars 11792 1727096135.03372: variable 
'port1_profile' from source: play vars 11792 1727096135.03375: variable 'port1_profile' from source: play vars 11792 1727096135.03377: variable 'dhcp_interface1' from source: play vars 11792 1727096135.03379: variable 'dhcp_interface1' from source: play vars 11792 1727096135.03381: variable 'controller_profile' from source: play vars 11792 1727096135.03383: variable 'controller_profile' from source: play vars 11792 1727096135.03385: variable 'port2_profile' from source: play vars 11792 1727096135.03431: variable 'port2_profile' from source: play vars 11792 1727096135.03443: variable 'dhcp_interface2' from source: play vars 11792 1727096135.03511: variable 'dhcp_interface2' from source: play vars 11792 1727096135.03523: variable 'controller_profile' from source: play vars 11792 1727096135.03587: variable 'controller_profile' from source: play vars 11792 1727096135.03821: variable 'omit' from source: magic vars 11792 1727096135.03838: variable '__lsr_ansible_managed' from source: task vars 11792 1727096135.03904: variable '__lsr_ansible_managed' from source: task vars 11792 1727096135.04157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11792 1727096135.04656: Loaded config def from plugin (lookup/template) 11792 1727096135.04666: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11792 1727096135.04702: File lookup term: get_ansible_managed.j2 11792 1727096135.04709: variable 'ansible_search_path' from source: unknown 11792 1727096135.04718: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11792 1727096135.04733: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11792 1727096135.04757: variable 'ansible_search_path' from source: unknown 11792 1727096135.12588: variable 'ansible_managed' from source: unknown 11792 1727096135.12773: variable 'omit' from source: magic vars 11792 1727096135.12776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096135.12789: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096135.12815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096135.12837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096135.12853: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096135.12886: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096135.12894: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096135.12902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096135.13001: Set connection var ansible_timeout to 10 11792 1727096135.13031: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096135.13034: Set connection var ansible_shell_executable to /bin/sh 11792 1727096135.13038: Set connection var ansible_pipelining to False 11792 1727096135.13141: Set connection var ansible_shell_type to sh 11792 1727096135.13144: Set connection var ansible_connection to ssh 11792 1727096135.13147: variable 'ansible_shell_executable' from source: unknown 11792 1727096135.13149: variable 'ansible_connection' from source: unknown 11792 1727096135.13153: variable 'ansible_module_compression' from source: unknown 11792 1727096135.13155: variable 'ansible_shell_type' from source: unknown 11792 1727096135.13157: variable 'ansible_shell_executable' from source: unknown 11792 1727096135.13159: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096135.13161: variable 'ansible_pipelining' from source: unknown 11792 1727096135.13163: variable 'ansible_timeout' from source: unknown 11792 1727096135.13165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096135.13261: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096135.13278: variable 'omit' from source: magic vars 11792 1727096135.13290: starting attempt loop 11792 1727096135.13297: running the handler 11792 1727096135.13313: _low_level_execute_command(): starting 11792 1727096135.13323: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096135.14087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096135.14136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096135.14159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096135.14355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096135.14402: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11792 1727096135.16522: stdout chunk (state=3): >>>/root <<< 11792 1727096135.16560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096135.16563: stdout chunk (state=3): >>><<< 11792 1727096135.16566: stderr chunk (state=3): >>><<< 11792 1727096135.16629: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096135.16633: _low_level_execute_command(): starting 11792 1727096135.16636: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183 `" && echo ansible-tmp-1727096135.1658978-12569-268153631811183="` echo /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183 `" ) && sleep 0' 11792 1727096135.17933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096135.18092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096135.18187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096135.18253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096135.20322: stdout chunk (state=3): >>>ansible-tmp-1727096135.1658978-12569-268153631811183=/root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183 <<< 
11792 1727096135.20412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096135.20583: stderr chunk (state=3): >>><<< 11792 1727096135.20587: stdout chunk (state=3): >>><<< 11792 1727096135.20610: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096135.1658978-12569-268153631811183=/root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096135.20661: variable 'ansible_module_compression' from source: unknown 11792 1727096135.20712: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11792 1727096135.20720: ANSIBALLZ: Acquiring lock 11792 1727096135.20723: ANSIBALLZ: Lock acquired: 139635231902704 11792 1727096135.20725: ANSIBALLZ: Creating module 11792 1727096135.52919: ANSIBALLZ: Writing module into payload 11792 1727096135.53185: ANSIBALLZ: Writing module 11792 1727096135.53217: ANSIBALLZ: Renaming module 11792 1727096135.53229: ANSIBALLZ: Done creating module 11792 1727096135.53264: variable 'ansible_facts' from source: unknown 11792 1727096135.53382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/AnsiballZ_network_connections.py 11792 1727096135.53596: Sending initial data 11792 1727096135.53599: Sent initial data (168 bytes) 11792 1727096135.54637: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096135.54641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096135.54644: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096135.54649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096135.54679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096135.54844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096135.56604: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096135.56629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096135.56697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp5p11u488 /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/AnsiballZ_network_connections.py <<< 11792 1727096135.57057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/AnsiballZ_network_connections.py" <<< 11792 1727096135.57061: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp5p11u488" to remote "/root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/AnsiballZ_network_connections.py" <<< 11792 1727096135.57882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096135.57952: stderr chunk (state=3): >>><<< 11792 1727096135.57958: stdout chunk (state=3): >>><<< 11792 1727096135.58007: done transferring module to remote 11792 1727096135.58019: _low_level_execute_command(): starting 11792 1727096135.58022: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/ /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/AnsiballZ_network_connections.py && sleep 0' 11792 1727096135.58902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096135.58906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096135.58908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096135.58910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096135.58914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096135.59011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096135.59014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096135.59045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096135.59136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096135.61069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096135.61074: stdout chunk (state=3): >>><<< 11792 1727096135.61080: stderr chunk (state=3): >>><<< 11792 1727096135.61102: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096135.61106: _low_level_execute_command(): starting 11792 1727096135.61110: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/AnsiballZ_network_connections.py && sleep 0' 11792 1727096135.62990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096136.12766: stdout chunk (state=3): >>> <<< 11792 1727096136.12773: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11792 1727096136.15237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096136.15241: stderr chunk (state=3): >>><<< 11792 1727096136.15243: stdout chunk (state=3): >>><<< 11792 1727096136.15270: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096136.15466: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': '802.3ad', 'ad_actor_sys_prio': 65535, 'ad_actor_system': '00:00:5e:00:53:5d', 'ad_select': 'stable', 'ad_user_port_key': 1023, 'all_ports_active': True, 'downdelay': 0, 'lacp_rate': 'slow', 'lp_interval': 128, 'miimon': 110, 'min_links': 0, 'num_grat_arp': 64, 'primary_reselect': 'better', 'resend_igmp': 225, 'updelay': 0, 'use_carrier': True, 'xmit_hash_policy': 'encap2+3'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096136.15501: _low_level_execute_command(): starting 11792 1727096136.15506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096135.1658978-12569-268153631811183/ > /dev/null 2>&1 && sleep 0' 11792 1727096136.16842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096136.16845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096136.16847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096136.16849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096136.16851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096136.16862: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096136.16876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096136.16893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096136.16986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096136.17215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096136.17244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096136.19575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096136.19580: stdout chunk (state=3): >>><<< 11792 1727096136.19582: stderr chunk (state=3): >>><<< 11792 1727096136.19585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096136.19588: handler run complete 11792 1727096136.19590: attempt loop complete, returning result 11792 1727096136.19592: _execute() done 11792 1727096136.19594: dumping result to json 11792 1727096136.19596: done dumping result, returning 11792 1727096136.19598: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-d9c7-3fc0-000000000287] 11792 1727096136.19600: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000287 11792 1727096136.19874: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000287 11792 1727096136.19877: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": 
"test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active) 11792 1727096136.20038: no more pending results, returning what we have 11792 1727096136.20042: results queue empty 11792 1727096136.20043: checking for any_errors_fatal 11792 1727096136.20050: done checking for any_errors_fatal 11792 1727096136.20051: checking for max_fail_percentage 11792 1727096136.20053: done checking for max_fail_percentage 11792 1727096136.20053: checking to see if all hosts have failed and the running result is not ok 11792 1727096136.20054: done checking to see if all hosts have failed 11792 1727096136.20055: getting the remaining hosts for this loop 11792 1727096136.20056: done getting the remaining hosts for this loop 11792 1727096136.20060: getting the next task for host managed_node2 11792 1727096136.20470: done getting next task for host managed_node2 11792 1727096136.20475: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096136.20479: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096136.20496: getting variables 11792 1727096136.20499: in VariableManager get_vars() 11792 1727096136.20534: Calling all_inventory to load vars for managed_node2 11792 1727096136.20537: Calling groups_inventory to load vars for managed_node2 11792 1727096136.20539: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096136.20548: Calling all_plugins_play to load vars for managed_node2 11792 1727096136.20551: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096136.20553: Calling groups_plugins_play to load vars for managed_node2 11792 1727096136.22485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096136.24285: done with get_vars() 11792 1727096136.24317: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:36 -0400 (0:00:01.268) 0:00:18.523 ****** 11792 1727096136.24417: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096136.24418: Creating lock for fedora.linux_system_roles.network_state 11792 1727096136.24800: worker is 1 (out of 1 available) 11792 1727096136.24815: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096136.24833: done queuing things up, now waiting for results queue to drain 11792 1727096136.24834: waiting for pending results... 11792 1727096136.25130: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096136.25292: in run() - task 0afff68d-5257-d9c7-3fc0-000000000288 11792 1727096136.25318: variable 'ansible_search_path' from source: unknown 11792 1727096136.25327: variable 'ansible_search_path' from source: unknown 11792 1727096136.25373: calling self._execute() 11792 1727096136.25482: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.25512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.25574: variable 'omit' from source: magic vars 11792 1727096136.26393: variable 'ansible_distribution_major_version' from source: facts 11792 1727096136.26398: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096136.26576: variable 'network_state' from source: role '' defaults 11792 1727096136.26593: Evaluated conditional (network_state != {}): False 11792 1727096136.26602: when evaluation is False, skipping this task 11792 1727096136.26618: _execute() done 11792 1727096136.26627: dumping result to json 11792 1727096136.26635: done dumping result, returning 11792 1727096136.26648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-d9c7-3fc0-000000000288] 11792 1727096136.26661: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000288 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096136.26927: no more pending results, returning what we have 11792 1727096136.26931: results queue empty 11792 1727096136.26932: checking for any_errors_fatal 11792 1727096136.26956: done checking for any_errors_fatal 11792 1727096136.26958: checking for max_fail_percentage 11792 1727096136.26960: done checking for max_fail_percentage 11792 
1727096136.26960: checking to see if all hosts have failed and the running result is not ok 11792 1727096136.26961: done checking to see if all hosts have failed 11792 1727096136.26962: getting the remaining hosts for this loop 11792 1727096136.26964: done getting the remaining hosts for this loop 11792 1727096136.26970: getting the next task for host managed_node2 11792 1727096136.26979: done getting next task for host managed_node2 11792 1727096136.26982: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096136.26988: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096136.27004: getting variables 11792 1727096136.27006: in VariableManager get_vars() 11792 1727096136.27045: Calling all_inventory to load vars for managed_node2 11792 1727096136.27048: Calling groups_inventory to load vars for managed_node2 11792 1727096136.27053: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096136.27084: Calling all_plugins_play to load vars for managed_node2 11792 1727096136.27089: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096136.27096: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000288 11792 1727096136.27098: WORKER PROCESS EXITING 11792 1727096136.27103: Calling groups_plugins_play to load vars for managed_node2 11792 1727096136.28775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096136.30709: done with get_vars() 11792 1727096136.30746: done getting variables 11792 1727096136.30813: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:36 -0400 (0:00:00.064) 0:00:18.587 ****** 11792 1727096136.30848: entering _queue_task() for managed_node2/debug 11792 1727096136.31357: worker is 1 (out of 1 available) 11792 1727096136.31371: exiting _queue_task() for managed_node2/debug 11792 1727096136.31384: done queuing things up, now waiting for 
results queue to drain 11792 1727096136.31386: waiting for pending results... 11792 1727096136.32007: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096136.32161: in run() - task 0afff68d-5257-d9c7-3fc0-000000000289 11792 1727096136.32189: variable 'ansible_search_path' from source: unknown 11792 1727096136.32274: variable 'ansible_search_path' from source: unknown 11792 1727096136.32278: calling self._execute() 11792 1727096136.32341: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.32356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.32373: variable 'omit' from source: magic vars 11792 1727096136.32771: variable 'ansible_distribution_major_version' from source: facts 11792 1727096136.32788: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096136.32799: variable 'omit' from source: magic vars 11792 1727096136.32871: variable 'omit' from source: magic vars 11792 1727096136.32910: variable 'omit' from source: magic vars 11792 1727096136.32960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096136.33045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096136.33049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096136.33057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096136.33075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096136.33109: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096136.33119: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.33127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.33233: Set connection var ansible_timeout to 10 11792 1727096136.33262: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096136.33271: Set connection var ansible_shell_executable to /bin/sh 11792 1727096136.33472: Set connection var ansible_pipelining to False 11792 1727096136.33475: Set connection var ansible_shell_type to sh 11792 1727096136.33477: Set connection var ansible_connection to ssh 11792 1727096136.33479: variable 'ansible_shell_executable' from source: unknown 11792 1727096136.33481: variable 'ansible_connection' from source: unknown 11792 1727096136.33483: variable 'ansible_module_compression' from source: unknown 11792 1727096136.33485: variable 'ansible_shell_type' from source: unknown 11792 1727096136.33487: variable 'ansible_shell_executable' from source: unknown 11792 1727096136.33489: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.33491: variable 'ansible_pipelining' from source: unknown 11792 1727096136.33492: variable 'ansible_timeout' from source: unknown 11792 1727096136.33494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.33497: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096136.33513: variable 'omit' from source: magic vars 11792 1727096136.33523: starting attempt loop 11792 1727096136.33531: running the handler 11792 1727096136.33671: variable '__network_connections_result' from source: set_fact 11792 1727096136.33747: handler run complete 11792 1727096136.33776: attempt loop complete, returning result 11792 1727096136.33783: _execute() done 11792 1727096136.33790: dumping result to json 11792 1727096136.33797: done dumping result, returning 11792 1727096136.33810: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-d9c7-3fc0-000000000289] 11792 1727096136.33818: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000289 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active)" ] } 11792 1727096136.34010: no more pending results, returning what we have 11792 1727096136.34014: results queue empty 11792 1727096136.34015: checking for any_errors_fatal 11792 1727096136.34021: done checking for any_errors_fatal 11792 1727096136.34022: checking for max_fail_percentage 11792 1727096136.34024: done checking for max_fail_percentage 11792 1727096136.34025: checking to see if all hosts have failed and the running result is not ok 11792 1727096136.34025: done checking to see if all hosts have failed 11792 1727096136.34026: getting the remaining hosts for this loop 11792 1727096136.34027: done getting the remaining hosts for this loop 11792 1727096136.34031: getting the next task for host managed_node2 11792 1727096136.34040: done getting next task for host managed_node2 11792 1727096136.34044: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096136.34049: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096136.34063: getting variables 11792 1727096136.34065: in VariableManager get_vars() 11792 1727096136.34109: Calling all_inventory to load vars for managed_node2 11792 1727096136.34112: Calling groups_inventory to load vars for managed_node2 11792 1727096136.34115: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096136.34127: Calling all_plugins_play to load vars for managed_node2 11792 1727096136.34130: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096136.34133: Calling groups_plugins_play to load vars for managed_node2 11792 1727096136.34866: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000289 11792 1727096136.34873: WORKER PROCESS EXITING 11792 1727096136.36360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096136.39890: done with get_vars() 11792 1727096136.39929: done getting variables 11792 1727096136.40008: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:36 -0400 (0:00:00.091) 0:00:18.679 ****** 11792 1727096136.40045: entering _queue_task() for managed_node2/debug 11792 1727096136.40807: worker is 1 (out of 1 available) 11792 1727096136.40819: exiting _queue_task() for managed_node2/debug 11792 1727096136.40833: done queuing things up, now waiting for results queue to drain 11792 1727096136.40838: waiting for pending results... 
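The module invocation and result echoed above show the connection profiles the network role just applied: an 802.3ad bond named bond0 on interface nm-bond with two ethernet ports, bond0.0 (test1) and bond0.1 (test2), handled by the NetworkManager provider ("nm"). Assuming the playbook drives the role through its network_connections variable (the playbook itself is not part of this log), role input along the following lines would produce that invocation; every value below is copied from the logged module_args, the surrounding variable layout is an illustration only:

# Sketch of role input; not taken from the playbook under test.
# Values copied from the module_args logged above.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: "802.3ad"
      ad_actor_sys_prio: 65535
      ad_actor_system: "00:00:5e:00:53:5d"
      ad_select: stable
      ad_user_port_key: 1023
      all_ports_active: true
      downdelay: 0
      lacp_rate: slow
      lp_interval: 128
      miimon: 110
      min_links: 0
      num_grat_arp: 64
      primary_reselect: better
      resend_igmp: 225
      updelay: 0
      use_carrier: true
      xmit_hash_policy: "encap2+3"
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0

The stderr lines [007] through [012] captured in the result record each of these three profiles first being added and then activated.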
11792 1727096136.41289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096136.41455: in run() - task 0afff68d-5257-d9c7-3fc0-00000000028a 11792 1727096136.41499: variable 'ansible_search_path' from source: unknown 11792 1727096136.41504: variable 'ansible_search_path' from source: unknown 11792 1727096136.41575: calling self._execute() 11792 1727096136.41687: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.41696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.41712: variable 'omit' from source: magic vars 11792 1727096136.42160: variable 'ansible_distribution_major_version' from source: facts 11792 1727096136.42171: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096136.42177: variable 'omit' from source: magic vars 11792 1727096136.42252: variable 'omit' from source: magic vars 11792 1727096136.42283: variable 'omit' from source: magic vars 11792 1727096136.42316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096136.42345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096136.42363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096136.42377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096136.42387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096136.42410: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096136.42413: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.42415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.42490: Set connection var ansible_timeout to 10 11792 1727096136.42497: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096136.42505: Set connection var ansible_shell_executable to /bin/sh 11792 1727096136.42509: Set connection var ansible_pipelining to False 11792 1727096136.42512: Set connection var ansible_shell_type to sh 11792 1727096136.42514: Set connection var ansible_connection to ssh 11792 1727096136.42531: variable 'ansible_shell_executable' from source: unknown 11792 1727096136.42533: variable 'ansible_connection' from source: unknown 11792 1727096136.42536: variable 'ansible_module_compression' from source: unknown 11792 1727096136.42538: variable 'ansible_shell_type' from source: unknown 11792 1727096136.42541: variable 'ansible_shell_executable' from source: unknown 11792 1727096136.42543: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.42555: variable 'ansible_pipelining' from source: unknown 11792 1727096136.42558: variable 'ansible_timeout' from source: unknown 11792 1727096136.42560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.42773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 
1727096136.42777: variable 'omit' from source: magic vars 11792 1727096136.42784: starting attempt loop 11792 1727096136.42786: running the handler 11792 1727096136.42789: variable '__network_connections_result' from source: set_fact 11792 1727096136.42862: variable '__network_connections_result' from source: set_fact 11792 1727096136.43083: handler run complete 11792 1727096136.43127: attempt loop complete, returning result 11792 1727096136.43135: _execute() done 11792 1727096136.43141: dumping result to json 11792 1727096136.43153: done dumping result, returning 11792 1727096136.43166: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-d9c7-3fc0-00000000028a] 11792 1727096136.43178: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000028a 11792 1727096136.43409: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000028a 11792 1727096136.43413: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active)" 
] } } 11792 1727096136.43542: no more pending results, returning what we have 11792 1727096136.43546: results queue empty 11792 1727096136.43547: checking for any_errors_fatal 11792 1727096136.43556: done checking for any_errors_fatal 11792 1727096136.43557: checking for max_fail_percentage 11792 1727096136.43559: done checking for max_fail_percentage 11792 1727096136.43560: checking to see if all hosts have failed and the running result is not ok 11792 1727096136.43561: done checking to see if all hosts have failed 11792 1727096136.43561: getting the remaining hosts for this loop 11792 1727096136.43563: done getting the remaining hosts for this loop 11792 1727096136.43675: getting the next task for host managed_node2 11792 1727096136.43684: done getting next task for host managed_node2 11792 1727096136.43688: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096136.43694: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096136.43706: getting variables 11792 1727096136.43708: in VariableManager get_vars() 11792 1727096136.43745: Calling all_inventory to load vars for managed_node2 11792 1727096136.43748: Calling groups_inventory to load vars for managed_node2 11792 1727096136.43754: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096136.43764: Calling all_plugins_play to load vars for managed_node2 11792 1727096136.43769: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096136.43772: Calling groups_plugins_play to load vars for managed_node2 11792 1727096136.45046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096136.47635: done with get_vars() 11792 1727096136.47784: done getting variables 11792 1727096136.47848: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:36 -0400 (0:00:00.079) 0:00:18.759 ****** 11792 1727096136.48003: entering _queue_task() for managed_node2/debug 11792 1727096136.48629: worker is 1 (out of 1 available) 11792 1727096136.48759: exiting _queue_task() for managed_node2/debug 11792 1727096136.48775: done queuing things up, now waiting for results queue to drain 11792 1727096136.48777: waiting for pending results... 11792 1727096136.49386: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096136.49490: in run() - task 0afff68d-5257-d9c7-3fc0-00000000028b 11792 1727096136.49511: variable 'ansible_search_path' from source: unknown 11792 1727096136.49518: variable 'ansible_search_path' from source: unknown 11792 1727096136.49560: calling self._execute() 11792 1727096136.49872: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.49975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.49978: variable 'omit' from source: magic vars 11792 1727096136.50873: variable 'ansible_distribution_major_version' from source: facts 11792 1727096136.50877: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096136.50880: variable 'network_state' from source: role '' defaults 11792 1727096136.50883: Evaluated conditional (network_state != {}): False 11792 1727096136.50885: when evaluation is False, skipping this task 11792 1727096136.50887: _execute() done 11792 1727096136.50889: dumping result to json 11792 1727096136.50891: done dumping result, returning 11792 1727096136.50893: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-d9c7-3fc0-00000000028b] 11792 1727096136.50895: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000028b skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11792 1727096136.51209: no more pending results, returning what we have 11792 1727096136.51213: results queue empty 11792 1727096136.51213: checking for any_errors_fatal 11792 1727096136.51228: done checking 
for any_errors_fatal 11792 1727096136.51229: checking for max_fail_percentage 11792 1727096136.51230: done checking for max_fail_percentage 11792 1727096136.51231: checking to see if all hosts have failed and the running result is not ok 11792 1727096136.51232: done checking to see if all hosts have failed 11792 1727096136.51232: getting the remaining hosts for this loop 11792 1727096136.51234: done getting the remaining hosts for this loop 11792 1727096136.51237: getting the next task for host managed_node2 11792 1727096136.51244: done getting next task for host managed_node2 11792 1727096136.51247: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096136.51260: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096136.51277: getting variables 11792 1727096136.51278: in VariableManager get_vars() 11792 1727096136.51314: Calling all_inventory to load vars for managed_node2 11792 1727096136.51316: Calling groups_inventory to load vars for managed_node2 11792 1727096136.51318: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096136.51329: Calling all_plugins_play to load vars for managed_node2 11792 1727096136.51332: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096136.51334: Calling groups_plugins_play to load vars for managed_node2 11792 1727096136.51884: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000028b 11792 1727096136.51888: WORKER PROCESS EXITING 11792 1727096136.54401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096136.58371: done with get_vars() 11792 1727096136.58403: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:36 -0400 (0:00:00.106) 0:00:18.865 ****** 11792 1727096136.58628: entering _queue_task() for managed_node2/ping 11792 1727096136.58630: Creating lock for ping 11792 1727096136.59789: worker is 1 (out of 1 available) 11792 1727096136.59802: exiting _queue_task() for managed_node2/ping 11792 1727096136.59815: done queuing things up, now waiting for results queue to drain 11792 1727096136.59817: waiting for pending results... 
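Everything from the "Show stderr messages" task above through the "Re-test connectivity" task below is end-of-role housekeeping: two debug tasks print the collected __network_connections_result, and a ping re-checks that the managed host is still reachable after the connection changes (the log shows each of them guarded by the ansible_distribution_major_version != '6' conditional). A minimal sketch of how such tasks are commonly written, using the standard debug and ping modules; this is an illustration, not the role's verbatim tasks/main.yml:

# Illustrative sketch only; the role's actual tasks/main.yml is not reproduced in this log.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result

- name: Re-test connectivity
  ansible.builtin.ping:

A ping task needs no arguments for this purpose; it only confirms that SSH login and the remote Python interpreter still work, which is why the log below shows a fresh AnsiballZ_ping.py payload being built, transferred to the target, and executed.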
11792 1727096136.60439: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096136.60973: in run() - task 0afff68d-5257-d9c7-3fc0-00000000028c 11792 1727096136.60977: variable 'ansible_search_path' from source: unknown 11792 1727096136.60980: variable 'ansible_search_path' from source: unknown 11792 1727096136.60983: calling self._execute() 11792 1727096136.60985: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.60988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.60990: variable 'omit' from source: magic vars 11792 1727096136.61725: variable 'ansible_distribution_major_version' from source: facts 11792 1727096136.61888: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096136.61901: variable 'omit' from source: magic vars 11792 1727096136.61975: variable 'omit' from source: magic vars 11792 1727096136.62012: variable 'omit' from source: magic vars 11792 1727096136.62114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096136.62159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096136.62574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096136.62577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096136.62580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096136.62582: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096136.62585: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.62587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.62589: Set connection var ansible_timeout to 10 11792 1727096136.62591: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096136.62776: Set connection var ansible_shell_executable to /bin/sh 11792 1727096136.62788: Set connection var ansible_pipelining to False 11792 1727096136.62795: Set connection var ansible_shell_type to sh 11792 1727096136.62807: Set connection var ansible_connection to ssh 11792 1727096136.62835: variable 'ansible_shell_executable' from source: unknown 11792 1727096136.62843: variable 'ansible_connection' from source: unknown 11792 1727096136.62853: variable 'ansible_module_compression' from source: unknown 11792 1727096136.62860: variable 'ansible_shell_type' from source: unknown 11792 1727096136.62870: variable 'ansible_shell_executable' from source: unknown 11792 1727096136.62878: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096136.62886: variable 'ansible_pipelining' from source: unknown 11792 1727096136.62892: variable 'ansible_timeout' from source: unknown 11792 1727096136.62903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096136.63304: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096136.63321: variable 'omit' from source: magic vars 11792 
1727096136.63331: starting attempt loop 11792 1727096136.63339: running the handler 11792 1727096136.63362: _low_level_execute_command(): starting 11792 1727096136.63376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096136.64993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096136.65012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096136.65357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096136.67074: stdout chunk (state=3): >>>/root <<< 11792 1727096136.67204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096136.67216: stdout chunk (state=3): >>><<< 11792 1727096136.67228: stderr chunk (state=3): >>><<< 11792 1727096136.67296: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096136.67315: _low_level_execute_command(): starting 11792 1727096136.67561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793 `" && echo ansible-tmp-1727096136.6730285-12656-244875122537793="` echo /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793 `" ) && sleep 0' 11792 1727096136.68788: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096136.68813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096136.68912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096136.68989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096136.71018: stdout chunk (state=3): >>>ansible-tmp-1727096136.6730285-12656-244875122537793=/root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793 <<< 11792 1727096136.71163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096136.71326: stderr chunk (state=3): >>><<< 11792 1727096136.71337: stdout chunk (state=3): >>><<< 11792 1727096136.71371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096136.6730285-12656-244875122537793=/root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096136.71774: variable 'ansible_module_compression' from source: unknown 11792 1727096136.71778: ANSIBALLZ: Using lock for ping 11792 1727096136.71780: ANSIBALLZ: Acquiring lock 11792 1727096136.71782: ANSIBALLZ: Lock acquired: 139635226449808 11792 1727096136.71784: ANSIBALLZ: Creating module 11792 1727096136.96891: ANSIBALLZ: Writing module into payload 11792 1727096136.96961: ANSIBALLZ: Writing module 11792 1727096136.97273: ANSIBALLZ: Renaming module 11792 
1727096136.97278: ANSIBALLZ: Done creating module 11792 1727096136.97281: variable 'ansible_facts' from source: unknown 11792 1727096136.97309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/AnsiballZ_ping.py 11792 1727096136.97589: Sending initial data 11792 1727096136.97599: Sent initial data (153 bytes) 11792 1727096136.99091: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096136.99262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096136.99280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096136.99350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096137.01077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096137.01294: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp22asjj0_" to remote "/root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/AnsiballZ_ping.py" <<< 11792 1727096137.01299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp22asjj0_ /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/AnsiballZ_ping.py <<< 11792 1727096137.02858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096137.02944: stderr chunk (state=3): >>><<< 11792 1727096137.02957: stdout chunk (state=3): >>><<< 11792 1727096137.02998: done transferring module to remote 11792 1727096137.03260: _low_level_execute_command(): starting 11792 1727096137.03263: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/ /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/AnsiballZ_ping.py && sleep 0' 11792 1727096137.04391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096137.04490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096137.04569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096137.06675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096137.06688: stdout chunk (state=3): >>><<< 11792 1727096137.06701: stderr chunk (state=3): >>><<< 11792 1727096137.06725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096137.06740: _low_level_execute_command(): starting 11792 1727096137.07060: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/AnsiballZ_ping.py && sleep 0' 11792 1727096137.08290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096137.08416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096137.08625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096137.08672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096137.24263: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11792 1727096137.25715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096137.25728: stdout chunk (state=3): >>><<< 11792 1727096137.25741: stderr chunk (state=3): >>><<< 11792 1727096137.25763: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096137.25796: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096137.25812: _low_level_execute_command(): starting 11792 1727096137.26044: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096136.6730285-12656-244875122537793/ > /dev/null 2>&1 && sleep 0' 11792 1727096137.27209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096137.27225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096137.27241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096137.27259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096137.27359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096137.27585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096137.27740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096137.29693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096137.29704: stdout chunk (state=3): >>><<< 11792 1727096137.29716: stderr chunk (state=3): >>><<< 11792 1727096137.29737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096137.29753: handler run complete 11792 1727096137.29780: attempt loop complete, returning result 11792 1727096137.29974: _execute() done 11792 1727096137.29978: dumping result to json 11792 1727096137.29980: done dumping result, returning 11792 1727096137.29982: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-d9c7-3fc0-00000000028c] 11792 1727096137.29984: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000028c 11792 1727096137.30060: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000028c 11792 1727096137.30065: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11792 1727096137.30130: no more pending results, returning what we have 11792 1727096137.30139: results queue empty 11792 1727096137.30140: checking for any_errors_fatal 11792 1727096137.30147: done checking for any_errors_fatal 11792 1727096137.30148: checking for max_fail_percentage 11792 1727096137.30149: done checking for max_fail_percentage 11792 1727096137.30150: checking to see if all hosts have failed and the running result is not ok 11792 1727096137.30151: done checking to see if all hosts have failed 11792 1727096137.30151: getting the remaining hosts for this loop 11792 1727096137.30153: done getting the remaining hosts for this loop 11792 1727096137.30157: getting the next task for host managed_node2 11792 1727096137.30172: done getting next task for host managed_node2 11792 1727096137.30174: ^ task is: TASK: meta (role_complete) 11792 1727096137.30179: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096137.30189: getting variables 11792 1727096137.30191: in VariableManager get_vars() 11792 1727096137.30229: Calling all_inventory to load vars for managed_node2 11792 1727096137.30231: Calling groups_inventory to load vars for managed_node2 11792 1727096137.30234: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096137.30244: Calling all_plugins_play to load vars for managed_node2 11792 1727096137.30246: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096137.30249: Calling groups_plugins_play to load vars for managed_node2 11792 1727096137.33106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096137.37155: done with get_vars() 11792 1727096137.37187: done getting variables 11792 1727096137.37405: done queuing things up, now waiting for results queue to drain 11792 1727096137.37407: results queue empty 11792 1727096137.37408: checking for any_errors_fatal 11792 1727096137.37411: done checking for any_errors_fatal 11792 1727096137.37411: checking for max_fail_percentage 11792 1727096137.37413: done checking for max_fail_percentage 11792 1727096137.37413: checking to see if all hosts have failed and the running result is not ok 11792 1727096137.37414: done checking to see if all hosts have failed 11792 1727096137.37415: getting the remaining hosts for this loop 11792 1727096137.37416: done getting the remaining hosts for this loop 11792 1727096137.37429: getting the next task for host managed_node2 11792 1727096137.37435: done getting next task for host managed_node2 11792 1727096137.37437: ^ task is: TASK: Show result 11792 1727096137.37440: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11792 1727096137.37443: getting variables 11792 1727096137.37444: in VariableManager get_vars() 11792 1727096137.37454: Calling all_inventory to load vars for managed_node2 11792 1727096137.37456: Calling groups_inventory to load vars for managed_node2 11792 1727096137.37459: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096137.37464: Calling all_plugins_play to load vars for managed_node2 11792 1727096137.37467: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096137.37506: Calling groups_plugins_play to load vars for managed_node2 11792 1727096137.41385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096137.43996: done with get_vars() 11792 1727096137.44030: done getting variables 11792 1727096137.44082: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:46 Monday 23 September 2024 08:55:37 -0400 (0:00:00.854) 0:00:19.720 ****** 11792 1727096137.44118: entering _queue_task() for managed_node2/debug 11792 1727096137.44509: worker is 1 (out of 1 available) 11792 1727096137.44521: exiting _queue_task() for managed_node2/debug 11792 1727096137.44532: done queuing things up, now waiting for results queue to drain 11792 1727096137.44533: waiting for pending results... 
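The task queued here, "Show result" (create_bond_profile.yml:46), is a plain debug of the output the network role registered earlier. A minimal sketch of what such a task amounts to, assuming only the variable name that the executor reads in the following lines:

    # Hypothetical reconstruction of the "Show result" task; the variable name
    # __network_connections_result is taken from the executor trace below,
    # everything else is assumed.
    - name: Show result
      debug:
        var: __network_connections_result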
11792 1727096137.44841: running TaskExecutor() for managed_node2/TASK: Show result 11792 1727096137.45153: in run() - task 0afff68d-5257-d9c7-3fc0-0000000001c6 11792 1727096137.45274: variable 'ansible_search_path' from source: unknown 11792 1727096137.45277: variable 'ansible_search_path' from source: unknown 11792 1727096137.45280: calling self._execute() 11792 1727096137.45421: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.45428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.45440: variable 'omit' from source: magic vars 11792 1727096137.46968: variable 'ansible_distribution_major_version' from source: facts 11792 1727096137.46980: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096137.46988: variable 'omit' from source: magic vars 11792 1727096137.47203: variable 'omit' from source: magic vars 11792 1727096137.47403: variable 'omit' from source: magic vars 11792 1727096137.47485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096137.47614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096137.47635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096137.47657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096137.47830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096137.47881: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096137.47885: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.47887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.48333: Set connection var ansible_timeout to 10 11792 1727096137.48361: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096137.48365: Set connection var ansible_shell_executable to /bin/sh 11792 1727096137.48590: Set connection var ansible_pipelining to False 11792 1727096137.48593: Set connection var ansible_shell_type to sh 11792 1727096137.48596: Set connection var ansible_connection to ssh 11792 1727096137.48598: variable 'ansible_shell_executable' from source: unknown 11792 1727096137.48601: variable 'ansible_connection' from source: unknown 11792 1727096137.48604: variable 'ansible_module_compression' from source: unknown 11792 1727096137.48606: variable 'ansible_shell_type' from source: unknown 11792 1727096137.48608: variable 'ansible_shell_executable' from source: unknown 11792 1727096137.48610: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.48612: variable 'ansible_pipelining' from source: unknown 11792 1727096137.48613: variable 'ansible_timeout' from source: unknown 11792 1727096137.48615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.48883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096137.48895: variable 'omit' from source: magic vars 11792 1727096137.48910: 
starting attempt loop 11792 1727096137.48914: running the handler 11792 1727096137.48958: variable '__network_connections_result' from source: set_fact 11792 1727096137.49050: variable '__network_connections_result' from source: set_fact 11792 1727096137.49707: handler run complete 11792 1727096137.49715: attempt loop complete, returning result 11792 1727096137.49717: _execute() done 11792 1727096137.49787: dumping result to json 11792 1727096137.49833: done dumping result, returning 11792 1727096137.49842: done running TaskExecutor() for managed_node2/TASK: Show result [0afff68d-5257-d9c7-3fc0-0000000001c6] 11792 1727096137.49846: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000001c6 11792 1727096137.50140: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000001c6 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d4ead546-ed37-4db8-b8f2-1191a6c9350f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, b2e827b6-dfb9-4571-949e-e48e368f579a (not-active)" ] } } 11792 1727096137.50240: WORKER PROCESS EXITING 11792 1727096137.50296: no more pending results, returning what we have 11792 1727096137.50419: results queue empty 11792 1727096137.50420: 
checking for any_errors_fatal 11792 1727096137.50422: done checking for any_errors_fatal 11792 1727096137.50423: checking for max_fail_percentage 11792 1727096137.50425: done checking for max_fail_percentage 11792 1727096137.50426: checking to see if all hosts have failed and the running result is not ok 11792 1727096137.50427: done checking to see if all hosts have failed 11792 1727096137.50427: getting the remaining hosts for this loop 11792 1727096137.50429: done getting the remaining hosts for this loop 11792 1727096137.50433: getting the next task for host managed_node2 11792 1727096137.50442: done getting next task for host managed_node2 11792 1727096137.50446: ^ task is: TASK: Asserts 11792 1727096137.50449: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096137.50455: getting variables 11792 1727096137.50456: in VariableManager get_vars() 11792 1727096137.50498: Calling all_inventory to load vars for managed_node2 11792 1727096137.50501: Calling groups_inventory to load vars for managed_node2 11792 1727096137.50505: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096137.50519: Calling all_plugins_play to load vars for managed_node2 11792 1727096137.50522: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096137.50525: Calling groups_plugins_play to load vars for managed_node2 11792 1727096137.52848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096137.57414: done with get_vars() 11792 1727096137.57450: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Monday 23 September 2024 08:55:37 -0400 (0:00:00.137) 0:00:19.858 ****** 11792 1727096137.57978: entering _queue_task() for managed_node2/include_tasks 11792 1727096137.59021: worker is 1 (out of 1 available) 11792 1727096137.59034: exiting _queue_task() for managed_node2/include_tasks 11792 1727096137.59046: done queuing things up, now waiting for results queue to drain 11792 1727096137.59048: waiting for pending results... 
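The __network_connections_result payload printed above makes it possible to sketch the input that produced it. The following is a reconstruction, not the actual test playbook: the network_connections variable is the role's documented input, the option values are copied from the module_args shown in the result, and the play layout is assumed.

    # Sketch of a play that would drive fedora.linux_system_roles.network toward
    # the state shown in __network_connections_result above (subset of the bond
    # options only; the remaining values appear in the result payload).
    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: bond0
                state: up
                type: bond
                interface_name: nm-bond
                ip:
                  route_metric4: 65535
                bond:
                  mode: 802.3ad
                  ad_actor_sys_prio: 65535
                  ad_actor_system: "00:00:5e:00:53:5d"
                  miimon: 110
                  xmit_hash_policy: encap2+3
              - name: bond0.0
                state: up
                type: ethernet
                interface_name: test1
                controller: bond0
              - name: bond0.1
                state: up
                type: ethernet
                interface_name: test2
                controller: bond0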
11792 1727096137.59674: running TaskExecutor() for managed_node2/TASK: Asserts 11792 1727096137.59941: in run() - task 0afff68d-5257-d9c7-3fc0-00000000008d 11792 1727096137.59945: variable 'ansible_search_path' from source: unknown 11792 1727096137.60001: variable 'ansible_search_path' from source: unknown 11792 1727096137.60241: variable 'lsr_assert' from source: include params 11792 1727096137.60786: variable 'lsr_assert' from source: include params 11792 1727096137.61066: variable 'omit' from source: magic vars 11792 1727096137.61442: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.61446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.61688: variable 'omit' from source: magic vars 11792 1727096137.62144: variable 'ansible_distribution_major_version' from source: facts 11792 1727096137.62278: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096137.62281: variable 'item' from source: unknown 11792 1727096137.62343: variable 'item' from source: unknown 11792 1727096137.62548: variable 'item' from source: unknown 11792 1727096137.62554: variable 'item' from source: unknown 11792 1727096137.63113: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.63116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.63118: variable 'omit' from source: magic vars 11792 1727096137.63317: variable 'ansible_distribution_major_version' from source: facts 11792 1727096137.63320: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096137.63323: variable 'item' from source: unknown 11792 1727096137.63574: variable 'item' from source: unknown 11792 1727096137.63679: variable 'item' from source: unknown 11792 1727096137.63933: variable 'item' from source: unknown 11792 1727096137.64166: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.64266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.64272: variable 'omit' from source: magic vars 11792 1727096137.64775: variable 'ansible_distribution_major_version' from source: facts 11792 1727096137.64786: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096137.64808: variable 'item' from source: unknown 11792 1727096137.64958: variable 'item' from source: unknown 11792 1727096137.64961: variable 'item' from source: unknown 11792 1727096137.65089: variable 'item' from source: unknown 11792 1727096137.65414: dumping result to json 11792 1727096137.65417: done dumping result, returning 11792 1727096137.65420: done running TaskExecutor() for managed_node2/TASK: Asserts [0afff68d-5257-d9c7-3fc0-00000000008d] 11792 1727096137.65423: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008d 11792 1727096137.65476: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008d 11792 1727096137.65480: WORKER PROCESS EXITING 11792 1727096137.65517: no more pending results, returning what we have 11792 1727096137.65525: in VariableManager get_vars() 11792 1727096137.65566: Calling all_inventory to load vars for managed_node2 11792 1727096137.65571: Calling groups_inventory to load vars for managed_node2 11792 1727096137.65576: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096137.65598: Calling all_plugins_play to load vars for managed_node2 11792 1727096137.65601: Calling groups_plugins_inventory to load vars for 
managed_node2 11792 1727096137.65604: Calling groups_plugins_play to load vars for managed_node2 11792 1727096137.69399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096137.71246: done with get_vars() 11792 1727096137.71281: variable 'ansible_search_path' from source: unknown 11792 1727096137.71283: variable 'ansible_search_path' from source: unknown 11792 1727096137.71327: variable 'ansible_search_path' from source: unknown 11792 1727096137.71329: variable 'ansible_search_path' from source: unknown 11792 1727096137.71370: variable 'ansible_search_path' from source: unknown 11792 1727096137.71371: variable 'ansible_search_path' from source: unknown 11792 1727096137.71400: we have included files to process 11792 1727096137.71402: generating all_blocks data 11792 1727096137.71404: done generating all_blocks data 11792 1727096137.71409: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11792 1727096137.71410: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11792 1727096137.71413: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11792 1727096137.71581: in VariableManager get_vars() 11792 1727096137.71601: done with get_vars() 11792 1727096137.71607: variable 'item' from source: include params 11792 1727096137.71718: variable 'item' from source: include params 11792 1727096137.71748: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11792 1727096137.71839: in VariableManager get_vars() 11792 1727096137.71861: done with get_vars() 11792 1727096137.71997: done processing included file 11792 1727096137.71999: iterating over new_blocks loaded from include file 11792 1727096137.72000: in VariableManager get_vars() 11792 1727096137.72028: done with get_vars() 11792 1727096137.72030: filtering new block on tags 11792 1727096137.72108: done filtering new block on tags 11792 1727096137.72130: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml for managed_node2 => (item=tasks/assert_controller_device_present.yml) 11792 1727096137.72136: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11792 1727096137.72137: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11792 1727096137.72140: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11792 1727096137.72300: in VariableManager get_vars() 11792 1727096137.72318: done with get_vars() 11792 1727096137.72330: done processing included file 11792 1727096137.72332: iterating over new_blocks loaded from include file 11792 1727096137.72333: in VariableManager get_vars() 11792 
1727096137.72348: done with get_vars() 11792 1727096137.72349: filtering new block on tags 11792 1727096137.72385: done filtering new block on tags 11792 1727096137.72388: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml for managed_node2 => (item=tasks/assert_bond_port_profile_present.yml) 11792 1727096137.72392: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11792 1727096137.72393: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11792 1727096137.72401: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11792 1727096137.72841: in VariableManager get_vars() 11792 1727096137.72878: done with get_vars() 11792 1727096137.72932: in VariableManager get_vars() 11792 1727096137.72954: done with get_vars() 11792 1727096137.72972: done processing included file 11792 1727096137.72975: iterating over new_blocks loaded from include file 11792 1727096137.72976: in VariableManager get_vars() 11792 1727096137.72995: done with get_vars() 11792 1727096137.72997: filtering new block on tags 11792 1727096137.73037: done filtering new block on tags 11792 1727096137.73040: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node2 => (item=tasks/assert_bond_options.yml) 11792 1727096137.73044: extending task lists for all hosts with included blocks 11792 1727096137.75470: done extending task lists 11792 1727096137.75472: done processing included files 11792 1727096137.75473: results queue empty 11792 1727096137.75474: checking for any_errors_fatal 11792 1727096137.75480: done checking for any_errors_fatal 11792 1727096137.75481: checking for max_fail_percentage 11792 1727096137.75482: done checking for max_fail_percentage 11792 1727096137.75482: checking to see if all hosts have failed and the running result is not ok 11792 1727096137.75483: done checking to see if all hosts have failed 11792 1727096137.75484: getting the remaining hosts for this loop 11792 1727096137.75485: done getting the remaining hosts for this loop 11792 1727096137.75488: getting the next task for host managed_node2 11792 1727096137.75493: done getting next task for host managed_node2 11792 1727096137.75495: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11792 1727096137.75498: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096137.75501: getting variables 11792 1727096137.75502: in VariableManager get_vars() 11792 1727096137.75514: Calling all_inventory to load vars for managed_node2 11792 1727096137.75516: Calling groups_inventory to load vars for managed_node2 11792 1727096137.75519: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096137.75525: Calling all_plugins_play to load vars for managed_node2 11792 1727096137.75527: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096137.75536: Calling groups_plugins_play to load vars for managed_node2 11792 1727096137.77536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096137.79314: done with get_vars() 11792 1727096137.79344: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:37 -0400 (0:00:00.214) 0:00:20.073 ****** 11792 1727096137.79442: entering _queue_task() for managed_node2/include_tasks 11792 1727096137.79861: worker is 1 (out of 1 available) 11792 1727096137.79973: exiting _queue_task() for managed_node2/include_tasks 11792 1727096137.79987: done queuing things up, now waiting for results queue to drain 11792 1727096137.79989: waiting for pending results... 
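The three assert_*.yml files included above all come from the lsr_assert list ("from source: include params"). A sketch of the looping include that run_test.yml:36 likely performs; the loop construct is an assumption, only the variable name and the item files come from the trace:

    # Assumed shape of the "Asserts" task; lsr_assert and its items are taken
    # from the log, the loop syntax itself is a guess.
    - name: Asserts
      include_tasks: "{{ item }}"
      loop: "{{ lsr_assert }}"
      # In this run lsr_assert expands to:
      #   - tasks/assert_controller_device_present.yml
      #   - tasks/assert_bond_port_profile_present.yml
      #   - tasks/assert_bond_options.yml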
11792 1727096137.80418: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11792 1727096137.80425: in run() - task 0afff68d-5257-d9c7-3fc0-0000000003f5 11792 1727096137.80429: variable 'ansible_search_path' from source: unknown 11792 1727096137.80437: variable 'ansible_search_path' from source: unknown 11792 1727096137.80441: calling self._execute() 11792 1727096137.80510: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.80514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.80589: variable 'omit' from source: magic vars 11792 1727096137.81251: variable 'ansible_distribution_major_version' from source: facts 11792 1727096137.81271: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096137.81275: _execute() done 11792 1727096137.81282: dumping result to json 11792 1727096137.81285: done dumping result, returning 11792 1727096137.81288: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-d9c7-3fc0-0000000003f5] 11792 1727096137.81291: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000003f5 11792 1727096137.81529: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000003f5 11792 1727096137.81533: WORKER PROCESS EXITING 11792 1727096137.81569: no more pending results, returning what we have 11792 1727096137.81574: in VariableManager get_vars() 11792 1727096137.81614: Calling all_inventory to load vars for managed_node2 11792 1727096137.81617: Calling groups_inventory to load vars for managed_node2 11792 1727096137.81621: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096137.81634: Calling all_plugins_play to load vars for managed_node2 11792 1727096137.81637: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096137.81640: Calling groups_plugins_play to load vars for managed_node2 11792 1727096137.83296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096137.90043: done with get_vars() 11792 1727096137.90072: variable 'ansible_search_path' from source: unknown 11792 1727096137.90074: variable 'ansible_search_path' from source: unknown 11792 1727096137.90119: we have included files to process 11792 1727096137.90120: generating all_blocks data 11792 1727096137.90121: done generating all_blocks data 11792 1727096137.90122: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096137.90123: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096137.90125: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096137.90308: done processing included file 11792 1727096137.90310: iterating over new_blocks loaded from include file 11792 1727096137.90312: in VariableManager get_vars() 11792 1727096137.90336: done with get_vars() 11792 1727096137.90337: filtering new block on tags 11792 1727096137.90370: done filtering new block on tags 11792 1727096137.90372: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11792 
1727096137.90377: extending task lists for all hosts with included blocks 11792 1727096137.90596: done extending task lists 11792 1727096137.90598: done processing included files 11792 1727096137.90598: results queue empty 11792 1727096137.90599: checking for any_errors_fatal 11792 1727096137.90602: done checking for any_errors_fatal 11792 1727096137.90603: checking for max_fail_percentage 11792 1727096137.90604: done checking for max_fail_percentage 11792 1727096137.90604: checking to see if all hosts have failed and the running result is not ok 11792 1727096137.90605: done checking to see if all hosts have failed 11792 1727096137.90606: getting the remaining hosts for this loop 11792 1727096137.90607: done getting the remaining hosts for this loop 11792 1727096137.90610: getting the next task for host managed_node2 11792 1727096137.90614: done getting next task for host managed_node2 11792 1727096137.90616: ^ task is: TASK: Get stat for interface {{ interface }} 11792 1727096137.90620: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096137.90622: getting variables 11792 1727096137.90623: in VariableManager get_vars() 11792 1727096137.90633: Calling all_inventory to load vars for managed_node2 11792 1727096137.90635: Calling groups_inventory to load vars for managed_node2 11792 1727096137.90638: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096137.90650: Calling all_plugins_play to load vars for managed_node2 11792 1727096137.90655: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096137.90659: Calling groups_plugins_play to load vars for managed_node2 11792 1727096137.91897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096137.93580: done with get_vars() 11792 1727096137.93607: done getting variables 11792 1727096137.93771: variable 'interface' from source: task vars 11792 1727096137.93775: variable 'controller_device' from source: play vars 11792 1727096137.93841: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:37 -0400 (0:00:00.144) 0:00:20.217 ****** 11792 1727096137.93878: entering _queue_task() for managed_node2/stat 11792 1727096137.94499: worker is 1 (out of 1 available) 11792 1727096137.94506: exiting _queue_task() for managed_node2/stat 11792 1727096137.94516: done queuing things up, now waiting for results queue to drain 11792 1727096137.94517: waiting for pending results... 11792 1727096137.94826: running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond 11792 1727096137.95008: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004af 11792 1727096137.95035: variable 'ansible_search_path' from source: unknown 11792 1727096137.95135: variable 'ansible_search_path' from source: unknown 11792 1727096137.95138: calling self._execute() 11792 1727096137.95188: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.95257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.95277: variable 'omit' from source: magic vars 11792 1727096137.95695: variable 'ansible_distribution_major_version' from source: facts 11792 1727096137.95711: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096137.95723: variable 'omit' from source: magic vars 11792 1727096137.95794: variable 'omit' from source: magic vars 11792 1727096137.95904: variable 'interface' from source: task vars 11792 1727096137.95914: variable 'controller_device' from source: play vars 11792 1727096137.95985: variable 'controller_device' from source: play vars 11792 1727096137.96018: variable 'omit' from source: magic vars 11792 1727096137.96071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096137.96120: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096137.96216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096137.96220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096137.96222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11792 1727096137.96225: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096137.96228: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.96237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.96355: Set connection var ansible_timeout to 10 11792 1727096137.96374: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096137.96389: Set connection var ansible_shell_executable to /bin/sh 11792 1727096137.96399: Set connection var ansible_pipelining to False 11792 1727096137.96407: Set connection var ansible_shell_type to sh 11792 1727096137.96414: Set connection var ansible_connection to ssh 11792 1727096137.96446: variable 'ansible_shell_executable' from source: unknown 11792 1727096137.96458: variable 'ansible_connection' from source: unknown 11792 1727096137.96467: variable 'ansible_module_compression' from source: unknown 11792 1727096137.96540: variable 'ansible_shell_type' from source: unknown 11792 1727096137.96544: variable 'ansible_shell_executable' from source: unknown 11792 1727096137.96546: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096137.96548: variable 'ansible_pipelining' from source: unknown 11792 1727096137.96553: variable 'ansible_timeout' from source: unknown 11792 1727096137.96556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096137.96729: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096137.96756: variable 'omit' from source: magic vars 11792 1727096137.96771: starting attempt loop 11792 1727096137.96779: running the handler 11792 1727096137.96799: _low_level_execute_command(): starting 11792 1727096137.96811: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096137.97748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096137.97774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096137.97864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096137.99827: stdout chunk (state=3): >>>/root <<< 11792 1727096137.99933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
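The stat module transfer that follows is driven by get_interface_stat.yml:3, whose templated task name "Get stat for interface {{ interface }}" and interface -> controller_device (nm-bond) resolution appear in the trace above. A plausible sketch of that task; the /sys/class/net path and the register name are assumptions, only the templated name and the use of the stat module are confirmed by the log:

    # Hypothetical get_interface_stat.yml task; path and register name assumed.
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"
      register: interface_stat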
11792 1727096137.99942: stdout chunk (state=3): >>><<< 11792 1727096137.99949: stderr chunk (state=3): >>><<< 11792 1727096137.99970: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096137.99985: _low_level_execute_command(): starting 11792 1727096137.99993: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046 `" && echo ansible-tmp-1727096137.9997063-12698-104312929384046="` echo /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046 `" ) && sleep 0' 11792 1727096138.00973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.00977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096138.01116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.01173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.03575: stdout chunk (state=3): >>>ansible-tmp-1727096137.9997063-12698-104312929384046=/root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046 <<< 11792 1727096138.03580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096138.03582: stdout chunk (state=3): >>><<< 11792 1727096138.03585: stderr chunk (state=3): >>><<< 11792 
1727096138.03587: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096137.9997063-12698-104312929384046=/root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096138.03590: variable 'ansible_module_compression' from source: unknown 11792 1727096138.03592: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096138.03594: variable 'ansible_facts' from source: unknown 11792 1727096138.03681: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/AnsiballZ_stat.py 11792 1727096138.03855: Sending initial data 11792 1727096138.03859: Sent initial data (153 bytes) 11792 1727096138.04498: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096138.04586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.04617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096138.04629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096138.04647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.04715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.06516: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096138.06532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096138.06595: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpy4q8ece5 /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/AnsiballZ_stat.py <<< 11792 1727096138.06629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/AnsiballZ_stat.py" <<< 11792 1727096138.06879: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpy4q8ece5" to remote "/root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/AnsiballZ_stat.py" <<< 11792 1727096138.07491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096138.07673: stderr chunk (state=3): >>><<< 11792 1727096138.07676: stdout chunk (state=3): >>><<< 11792 1727096138.07704: done transferring module to remote 11792 1727096138.07718: _low_level_execute_command(): starting 11792 1727096138.07723: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/ /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/AnsiballZ_stat.py && sleep 0' 11792 1727096138.08505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096138.08535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096138.08555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.08576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096138.08595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096138.08607: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096138.08621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.08687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.08736: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096138.08786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.08864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.10990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096138.11006: stderr chunk (state=3): >>><<< 11792 1727096138.11020: stdout chunk (state=3): >>><<< 11792 1727096138.11177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096138.11181: _low_level_execute_command(): starting 11792 1727096138.11184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/AnsiballZ_stat.py && sleep 0' 11792 1727096138.12309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096138.12386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.12455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096138.12478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096138.12687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.12755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.29149: stdout chunk (state=3): >>> 
{"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27303, "dev": 23, "nlink": 1, "atime": 1727096135.9922762, "mtime": 1727096135.9922762, "ctime": 1727096135.9922762, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096138.30858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096138.30891: stdout chunk (state=3): >>><<< 11792 1727096138.30908: stderr chunk (state=3): >>><<< 11792 1727096138.30936: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27303, "dev": 23, "nlink": 1, "atime": 1727096135.9922762, "mtime": 1727096135.9922762, "ctime": 1727096135.9922762, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096138.31033: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096138.31042: _low_level_execute_command(): starting 11792 1727096138.31055: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096137.9997063-12698-104312929384046/ > /dev/null 2>&1 && sleep 0' 11792 1727096138.31561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.31565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096138.31574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096138.31577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.31613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096138.31617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.31680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.33628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096138.33659: stderr chunk (state=3): >>><<< 11792 1727096138.33662: stdout chunk (state=3): >>><<< 11792 1727096138.33687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096138.33693: handler run complete 11792 1727096138.33724: attempt loop complete, returning result 11792 1727096138.33727: _execute() done 11792 1727096138.33730: dumping result to json 11792 1727096138.33735: done dumping result, returning 11792 1727096138.33743: done running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond [0afff68d-5257-d9c7-3fc0-0000000004af] 11792 1727096138.33747: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004af 11792 1727096138.33870: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004af 11792 1727096138.33873: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096135.9922762, "block_size": 4096, "blocks": 0, "ctime": 1727096135.9922762, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27303, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727096135.9922762, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11792 1727096138.33956: no more pending results, returning what we have 11792 1727096138.33960: results queue empty 11792 1727096138.33961: checking for any_errors_fatal 11792 1727096138.33963: done checking for any_errors_fatal 11792 1727096138.33963: checking for max_fail_percentage 11792 1727096138.33966: done checking for max_fail_percentage 11792 1727096138.33967: checking to see if all hosts have failed and the running result is not ok 11792 1727096138.33977: done checking to see if all hosts have failed 11792 1727096138.33977: getting the remaining hosts for this loop 11792 1727096138.33979: done getting the remaining hosts for this loop 11792 1727096138.33983: getting the next task for host managed_node2 11792 1727096138.33992: done getting next task for host managed_node2 11792 1727096138.33993: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11792 1727096138.33997: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096138.34002: getting variables 11792 1727096138.34003: in VariableManager get_vars() 11792 1727096138.34032: Calling all_inventory to load vars for managed_node2 11792 1727096138.34034: Calling groups_inventory to load vars for managed_node2 11792 1727096138.34037: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096138.34047: Calling all_plugins_play to load vars for managed_node2 11792 1727096138.34049: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096138.34055: Calling groups_plugins_play to load vars for managed_node2 11792 1727096138.34974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096138.35875: done with get_vars() 11792 1727096138.35896: done getting variables 11792 1727096138.35963: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096138.36059: variable 'interface' from source: task vars 11792 1727096138.36062: variable 'controller_device' from source: play vars 11792 1727096138.36106: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:38 -0400 (0:00:00.422) 0:00:20.640 ****** 11792 1727096138.36132: entering _queue_task() for managed_node2/assert 11792 1727096138.36450: worker is 1 (out of 1 available) 11792 1727096138.36466: exiting _queue_task() for managed_node2/assert 11792 1727096138.36787: done queuing things up, now waiting for results queue to drain 11792 1727096138.36789: waiting for pending results... 
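The task announced above (assert_device_present.yml:5) asserts on the stat result gathered a moment earlier: the entries below record 'Evaluated conditional (interface_stat.stat.exists): True' and 'All assertions passed', and show 'interface' being filled from the controller_device play variable via task vars. A hedged sketch of what that assert likely looks like, with the failure message invented purely for illustration:

# Hypothetical reconstruction of the assert at assert_device_present.yml:5;
# only the conditional is confirmed by the log, the msg text is assumed.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
    msg: "Interface '{{ interface }}' is not present"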
11792 1727096138.36828: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' 11792 1727096138.37011: in run() - task 0afff68d-5257-d9c7-3fc0-0000000003f6 11792 1727096138.37024: variable 'ansible_search_path' from source: unknown 11792 1727096138.37029: variable 'ansible_search_path' from source: unknown 11792 1727096138.37187: calling self._execute() 11792 1727096138.37404: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.37411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.37420: variable 'omit' from source: magic vars 11792 1727096138.38083: variable 'ansible_distribution_major_version' from source: facts 11792 1727096138.38088: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096138.38091: variable 'omit' from source: magic vars 11792 1727096138.38094: variable 'omit' from source: magic vars 11792 1727096138.38300: variable 'interface' from source: task vars 11792 1727096138.38305: variable 'controller_device' from source: play vars 11792 1727096138.38308: variable 'controller_device' from source: play vars 11792 1727096138.38310: variable 'omit' from source: magic vars 11792 1727096138.38326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096138.38373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096138.38393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096138.38410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096138.38421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096138.38462: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096138.38465: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.38470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.38623: Set connection var ansible_timeout to 10 11792 1727096138.38627: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096138.38630: Set connection var ansible_shell_executable to /bin/sh 11792 1727096138.38633: Set connection var ansible_pipelining to False 11792 1727096138.38636: Set connection var ansible_shell_type to sh 11792 1727096138.38639: Set connection var ansible_connection to ssh 11792 1727096138.38641: variable 'ansible_shell_executable' from source: unknown 11792 1727096138.38643: variable 'ansible_connection' from source: unknown 11792 1727096138.38646: variable 'ansible_module_compression' from source: unknown 11792 1727096138.38648: variable 'ansible_shell_type' from source: unknown 11792 1727096138.38653: variable 'ansible_shell_executable' from source: unknown 11792 1727096138.38656: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.38658: variable 'ansible_pipelining' from source: unknown 11792 1727096138.38661: variable 'ansible_timeout' from source: unknown 11792 1727096138.38663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.38829: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096138.38833: variable 'omit' from source: magic vars 11792 1727096138.38838: starting attempt loop 11792 1727096138.38844: running the handler 11792 1727096138.38981: variable 'interface_stat' from source: set_fact 11792 1727096138.39065: Evaluated conditional (interface_stat.stat.exists): True 11792 1727096138.39071: handler run complete 11792 1727096138.39074: attempt loop complete, returning result 11792 1727096138.39076: _execute() done 11792 1727096138.39079: dumping result to json 11792 1727096138.39081: done dumping result, returning 11792 1727096138.39084: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' [0afff68d-5257-d9c7-3fc0-0000000003f6] 11792 1727096138.39141: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000003f6 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096138.39298: no more pending results, returning what we have 11792 1727096138.39302: results queue empty 11792 1727096138.39303: checking for any_errors_fatal 11792 1727096138.39312: done checking for any_errors_fatal 11792 1727096138.39313: checking for max_fail_percentage 11792 1727096138.39315: done checking for max_fail_percentage 11792 1727096138.39316: checking to see if all hosts have failed and the running result is not ok 11792 1727096138.39317: done checking to see if all hosts have failed 11792 1727096138.39317: getting the remaining hosts for this loop 11792 1727096138.39319: done getting the remaining hosts for this loop 11792 1727096138.39323: getting the next task for host managed_node2 11792 1727096138.39334: done getting next task for host managed_node2 11792 1727096138.39337: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11792 1727096138.39342: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096138.39347: getting variables 11792 1727096138.39348: in VariableManager get_vars() 11792 1727096138.39386: Calling all_inventory to load vars for managed_node2 11792 1727096138.39389: Calling groups_inventory to load vars for managed_node2 11792 1727096138.39392: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096138.39403: Calling all_plugins_play to load vars for managed_node2 11792 1727096138.39406: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096138.39408: Calling groups_plugins_play to load vars for managed_node2 11792 1727096138.39981: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000003f6 11792 1727096138.39984: WORKER PROCESS EXITING 11792 1727096138.40290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096138.41197: done with get_vars() 11792 1727096138.41225: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml:3 Monday 23 September 2024 08:55:38 -0400 (0:00:00.051) 0:00:20.692 ****** 11792 1727096138.41322: entering _queue_task() for managed_node2/include_tasks 11792 1727096138.41651: worker is 1 (out of 1 available) 11792 1727096138.41665: exiting _queue_task() for managed_node2/include_tasks 11792 1727096138.41680: done queuing things up, now waiting for results queue to drain 11792 1727096138.41682: waiting for pending results... 11792 1727096138.41928: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 11792 1727096138.42040: in run() - task 0afff68d-5257-d9c7-3fc0-0000000003fb 11792 1727096138.42084: variable 'ansible_search_path' from source: unknown 11792 1727096138.42089: variable 'ansible_search_path' from source: unknown 11792 1727096138.42119: variable 'controller_profile' from source: play vars 11792 1727096138.42332: variable 'controller_profile' from source: play vars 11792 1727096138.42409: variable 'port1_profile' from source: play vars 11792 1727096138.42428: variable 'port1_profile' from source: play vars 11792 1727096138.42435: variable 'port2_profile' from source: play vars 11792 1727096138.42502: variable 'port2_profile' from source: play vars 11792 1727096138.42528: variable 'omit' from source: magic vars 11792 1727096138.42700: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.42742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.42746: variable 'omit' from source: magic vars 11792 1727096138.43075: variable 'ansible_distribution_major_version' from source: facts 11792 1727096138.43079: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096138.43082: variable 'bond_port_profile' from source: unknown 11792 1727096138.43125: variable 'bond_port_profile' from source: unknown 11792 1727096138.43350: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.43356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.43359: variable 'omit' from source: magic vars 11792 1727096138.43573: variable 'ansible_distribution_major_version' from source: facts 11792 1727096138.43579: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096138.43582: variable 'bond_port_profile' 
from source: unknown 11792 1727096138.43584: variable 'bond_port_profile' from source: unknown 11792 1727096138.43644: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.43648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.43650: variable 'omit' from source: magic vars 11792 1727096138.43865: variable 'ansible_distribution_major_version' from source: facts 11792 1727096138.43985: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096138.43992: variable 'bond_port_profile' from source: unknown 11792 1727096138.43994: variable 'bond_port_profile' from source: unknown 11792 1727096138.44055: dumping result to json 11792 1727096138.44058: done dumping result, returning 11792 1727096138.44061: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-d9c7-3fc0-0000000003fb] 11792 1727096138.44063: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000003fb 11792 1727096138.44107: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000003fb 11792 1727096138.44110: WORKER PROCESS EXITING 11792 1727096138.44135: no more pending results, returning what we have 11792 1727096138.44140: in VariableManager get_vars() 11792 1727096138.44181: Calling all_inventory to load vars for managed_node2 11792 1727096138.44184: Calling groups_inventory to load vars for managed_node2 11792 1727096138.44187: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096138.44234: Calling all_plugins_play to load vars for managed_node2 11792 1727096138.44238: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096138.44241: Calling groups_plugins_play to load vars for managed_node2 11792 1727096138.46646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096138.47520: done with get_vars() 11792 1727096138.47536: variable 'ansible_search_path' from source: unknown 11792 1727096138.47537: variable 'ansible_search_path' from source: unknown 11792 1727096138.47545: variable 'item' from source: include params 11792 1727096138.47629: variable 'item' from source: include params 11792 1727096138.47659: variable 'ansible_search_path' from source: unknown 11792 1727096138.47660: variable 'ansible_search_path' from source: unknown 11792 1727096138.47665: variable 'item' from source: include params 11792 1727096138.47707: variable 'item' from source: include params 11792 1727096138.47731: variable 'ansible_search_path' from source: unknown 11792 1727096138.47732: variable 'ansible_search_path' from source: unknown 11792 1727096138.47736: variable 'item' from source: include params 11792 1727096138.47779: variable 'item' from source: include params 11792 1727096138.47797: we have included files to process 11792 1727096138.47798: generating all_blocks data 11792 1727096138.47799: done generating all_blocks data 11792 1727096138.47803: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.47803: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.47805: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.47938: in 
VariableManager get_vars() 11792 1727096138.47954: done with get_vars() 11792 1727096138.48126: done processing included file 11792 1727096138.48128: iterating over new_blocks loaded from include file 11792 1727096138.48129: in VariableManager get_vars() 11792 1727096138.48138: done with get_vars() 11792 1727096138.48139: filtering new block on tags 11792 1727096138.48183: done filtering new block on tags 11792 1727096138.48185: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0) 11792 1727096138.48188: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.48189: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.48191: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.48253: in VariableManager get_vars() 11792 1727096138.48270: done with get_vars() 11792 1727096138.48603: done processing included file 11792 1727096138.48604: iterating over new_blocks loaded from include file 11792 1727096138.48606: in VariableManager get_vars() 11792 1727096138.49106: done with get_vars() 11792 1727096138.49108: filtering new block on tags 11792 1727096138.49165: done filtering new block on tags 11792 1727096138.49170: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.0) 11792 1727096138.49175: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.49176: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.49179: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11792 1727096138.49500: in VariableManager get_vars() 11792 1727096138.49522: done with get_vars() 11792 1727096138.49881: done processing included file 11792 1727096138.49883: iterating over new_blocks loaded from include file 11792 1727096138.49884: in VariableManager get_vars() 11792 1727096138.49898: done with get_vars() 11792 1727096138.49900: filtering new block on tags 11792 1727096138.49950: done filtering new block on tags 11792 1727096138.49955: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.1) 11792 1727096138.49959: extending task lists for all hosts with included blocks 11792 1727096138.50243: done extending task lists 11792 1727096138.50245: done processing included files 11792 1727096138.50246: results queue empty 11792 1727096138.50246: checking for any_errors_fatal 11792 1727096138.50249: done checking for any_errors_fatal 11792 1727096138.50250: checking for max_fail_percentage 11792 1727096138.50254: done checking for max_fail_percentage 11792 1727096138.50254: checking to see if 
all hosts have failed and the running result is not ok 11792 1727096138.50255: done checking to see if all hosts have failed 11792 1727096138.50256: getting the remaining hosts for this loop 11792 1727096138.50257: done getting the remaining hosts for this loop 11792 1727096138.50260: getting the next task for host managed_node2 11792 1727096138.50264: done getting next task for host managed_node2 11792 1727096138.50266: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11792 1727096138.50272: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096138.50274: getting variables 11792 1727096138.50275: in VariableManager get_vars() 11792 1727096138.50284: Calling all_inventory to load vars for managed_node2 11792 1727096138.50286: Calling groups_inventory to load vars for managed_node2 11792 1727096138.50288: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096138.50294: Calling all_plugins_play to load vars for managed_node2 11792 1727096138.50296: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096138.50298: Calling groups_plugins_play to load vars for managed_node2 11792 1727096138.51564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096138.53163: done with get_vars() 11792 1727096138.53194: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:55:38 -0400 (0:00:00.119) 0:00:20.811 ****** 11792 1727096138.53279: entering _queue_task() for managed_node2/include_tasks 11792 1727096138.53637: worker is 1 (out of 1 available) 11792 1727096138.53649: exiting _queue_task() for managed_node2/include_tasks 11792 1727096138.53664: done queuing things up, now waiting for results queue to drain 11792 1727096138.53666: waiting for pending results... 
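Just above, the include at assert_bond_port_profile_present.yml:3 expanded assert_profile_present.yml three times, once per item (bond0, bond0.0, bond0.1), passing each value on as both bond_port_profile and profile ('from source: include params'). A sketch of that loop, with the loop list written out from the play variables named in the log; the exact file contents are an assumption:

# Sketch of the include at assert_bond_port_profile_present.yml:3, inferred
# from the loop items, loop variable, and include params reported in the log.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: assert_profile_present.yml
  vars:
    profile: "{{ bond_port_profile }}"
  loop:
    - "{{ controller_profile }}"   # bond0
    - "{{ port1_profile }}"        # bond0.0
    - "{{ port2_profile }}"        # bond0.1
  loop_control:
    loop_var: bond_port_profile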
11792 1727096138.53998: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 11792 1727096138.54204: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004d9 11792 1727096138.54208: variable 'ansible_search_path' from source: unknown 11792 1727096138.54211: variable 'ansible_search_path' from source: unknown 11792 1727096138.54214: calling self._execute() 11792 1727096138.54233: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.54354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.54375: variable 'omit' from source: magic vars 11792 1727096138.54873: variable 'ansible_distribution_major_version' from source: facts 11792 1727096138.54903: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096138.54918: _execute() done 11792 1727096138.54930: dumping result to json 11792 1727096138.54969: done dumping result, returning 11792 1727096138.54981: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-d9c7-3fc0-0000000004d9] 11792 1727096138.54995: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004d9 11792 1727096138.55324: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004d9 11792 1727096138.55328: WORKER PROCESS EXITING 11792 1727096138.55375: no more pending results, returning what we have 11792 1727096138.55380: in VariableManager get_vars() 11792 1727096138.55425: Calling all_inventory to load vars for managed_node2 11792 1727096138.55432: Calling groups_inventory to load vars for managed_node2 11792 1727096138.55436: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096138.55456: Calling all_plugins_play to load vars for managed_node2 11792 1727096138.55460: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096138.55463: Calling groups_plugins_play to load vars for managed_node2 11792 1727096138.57362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096138.59253: done with get_vars() 11792 1727096138.59291: variable 'ansible_search_path' from source: unknown 11792 1727096138.59296: variable 'ansible_search_path' from source: unknown 11792 1727096138.59328: we have included files to process 11792 1727096138.59329: generating all_blocks data 11792 1727096138.59330: done generating all_blocks data 11792 1727096138.59331: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096138.59332: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096138.59334: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096138.60218: done processing included file 11792 1727096138.60219: iterating over new_blocks loaded from include file 11792 1727096138.60220: in VariableManager get_vars() 11792 1727096138.60243: done with get_vars() 11792 1727096138.60245: filtering new block on tags 11792 1727096138.60385: done filtering new block on tags 11792 1727096138.60391: in VariableManager get_vars() 11792 1727096138.60407: done with get_vars() 11792 1727096138.60409: filtering new block on tags 11792 1727096138.60464: done filtering new block on tags 11792 1727096138.60466: done iterating over 
new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 11792 1727096138.60472: extending task lists for all hosts with included blocks 11792 1727096138.60694: done extending task lists 11792 1727096138.60695: done processing included files 11792 1727096138.60695: results queue empty 11792 1727096138.60696: checking for any_errors_fatal 11792 1727096138.60699: done checking for any_errors_fatal 11792 1727096138.60699: checking for max_fail_percentage 11792 1727096138.60700: done checking for max_fail_percentage 11792 1727096138.60700: checking to see if all hosts have failed and the running result is not ok 11792 1727096138.60701: done checking to see if all hosts have failed 11792 1727096138.60701: getting the remaining hosts for this loop 11792 1727096138.60705: done getting the remaining hosts for this loop 11792 1727096138.60708: getting the next task for host managed_node2 11792 1727096138.60715: done getting next task for host managed_node2 11792 1727096138.60717: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11792 1727096138.60721: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096138.60724: getting variables 11792 1727096138.60725: in VariableManager get_vars() 11792 1727096138.60734: Calling all_inventory to load vars for managed_node2 11792 1727096138.60736: Calling groups_inventory to load vars for managed_node2 11792 1727096138.60738: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096138.60743: Calling all_plugins_play to load vars for managed_node2 11792 1727096138.60744: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096138.60746: Calling groups_plugins_play to load vars for managed_node2 11792 1727096138.61912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096138.63066: done with get_vars() 11792 1727096138.63091: done getting variables 11792 1727096138.63127: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:55:38 -0400 (0:00:00.098) 0:00:20.910 ****** 11792 1727096138.63156: entering _queue_task() for managed_node2/set_fact 11792 1727096138.63432: worker is 1 (out of 1 available) 11792 1727096138.63444: exiting _queue_task() for managed_node2/set_fact 11792 1727096138.63458: done queuing things up, now waiting for results queue to drain 11792 1727096138.63459: waiting for pending results... 
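The set_fact task announced above (get_profile_stat.yml:3) initializes the three profile flags whose names and starting values appear in the result a few entries below. A short reconstruction, based only on the ansible_facts printed in that result:

# Reconstruction of the initializer at get_profile_stat.yml:3; the fact names
# and false defaults are taken from the task result shown below in the log.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false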
11792 1727096138.63642: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 11792 1727096138.63726: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004fc 11792 1727096138.63739: variable 'ansible_search_path' from source: unknown 11792 1727096138.63742: variable 'ansible_search_path' from source: unknown 11792 1727096138.63774: calling self._execute() 11792 1727096138.63846: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.63850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.63860: variable 'omit' from source: magic vars 11792 1727096138.64149: variable 'ansible_distribution_major_version' from source: facts 11792 1727096138.64160: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096138.64166: variable 'omit' from source: magic vars 11792 1727096138.64207: variable 'omit' from source: magic vars 11792 1727096138.64235: variable 'omit' from source: magic vars 11792 1727096138.64272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096138.64300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096138.64318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096138.64331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096138.64341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096138.64369: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096138.64373: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.64375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.64447: Set connection var ansible_timeout to 10 11792 1727096138.64455: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096138.64458: Set connection var ansible_shell_executable to /bin/sh 11792 1727096138.64474: Set connection var ansible_pipelining to False 11792 1727096138.64477: Set connection var ansible_shell_type to sh 11792 1727096138.64481: Set connection var ansible_connection to ssh 11792 1727096138.64508: variable 'ansible_shell_executable' from source: unknown 11792 1727096138.64512: variable 'ansible_connection' from source: unknown 11792 1727096138.64515: variable 'ansible_module_compression' from source: unknown 11792 1727096138.64517: variable 'ansible_shell_type' from source: unknown 11792 1727096138.64520: variable 'ansible_shell_executable' from source: unknown 11792 1727096138.64522: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.64524: variable 'ansible_pipelining' from source: unknown 11792 1727096138.64526: variable 'ansible_timeout' from source: unknown 11792 1727096138.64529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.64789: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096138.64793: variable 
'omit' from source: magic vars 11792 1727096138.64795: starting attempt loop 11792 1727096138.64798: running the handler 11792 1727096138.64800: handler run complete 11792 1727096138.64802: attempt loop complete, returning result 11792 1727096138.64804: _execute() done 11792 1727096138.64806: dumping result to json 11792 1727096138.64808: done dumping result, returning 11792 1727096138.64979: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-d9c7-3fc0-0000000004fc] 11792 1727096138.64982: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004fc 11792 1727096138.65056: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004fc 11792 1727096138.65060: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11792 1727096138.65122: no more pending results, returning what we have 11792 1727096138.65126: results queue empty 11792 1727096138.65127: checking for any_errors_fatal 11792 1727096138.65129: done checking for any_errors_fatal 11792 1727096138.65129: checking for max_fail_percentage 11792 1727096138.65131: done checking for max_fail_percentage 11792 1727096138.65132: checking to see if all hosts have failed and the running result is not ok 11792 1727096138.65133: done checking to see if all hosts have failed 11792 1727096138.65134: getting the remaining hosts for this loop 11792 1727096138.65136: done getting the remaining hosts for this loop 11792 1727096138.65140: getting the next task for host managed_node2 11792 1727096138.65149: done getting next task for host managed_node2 11792 1727096138.65155: ^ task is: TASK: Stat profile file 11792 1727096138.65162: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096138.65166: getting variables 11792 1727096138.65170: in VariableManager get_vars() 11792 1727096138.65207: Calling all_inventory to load vars for managed_node2 11792 1727096138.65210: Calling groups_inventory to load vars for managed_node2 11792 1727096138.65213: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096138.65225: Calling all_plugins_play to load vars for managed_node2 11792 1727096138.65228: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096138.65231: Calling groups_plugins_play to load vars for managed_node2 11792 1727096138.66556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096138.68191: done with get_vars() 11792 1727096138.68222: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:55:38 -0400 (0:00:00.051) 0:00:20.962 ****** 11792 1727096138.68338: entering _queue_task() for managed_node2/stat 11792 1727096138.68690: worker is 1 (out of 1 available) 11792 1727096138.68706: exiting _queue_task() for managed_node2/stat 11792 1727096138.68722: done queuing things up, now waiting for results queue to drain 11792 1727096138.68725: waiting for pending results... 11792 1727096138.69130: running TaskExecutor() for managed_node2/TASK: Stat profile file 11792 1727096138.69265: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004fd 11792 1727096138.69282: variable 'ansible_search_path' from source: unknown 11792 1727096138.69285: variable 'ansible_search_path' from source: unknown 11792 1727096138.69288: calling self._execute() 11792 1727096138.69517: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.69609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.69613: variable 'omit' from source: magic vars 11792 1727096138.70365: variable 'ansible_distribution_major_version' from source: facts 11792 1727096138.70529: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096138.70534: variable 'omit' from source: magic vars 11792 1727096138.70617: variable 'omit' from source: magic vars 11792 1727096138.70767: variable 'profile' from source: include params 11792 1727096138.70834: variable 'bond_port_profile' from source: include params 11792 1727096138.70873: variable 'bond_port_profile' from source: include params 11792 1727096138.70901: variable 'omit' from source: magic vars 11792 1727096138.71002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096138.71057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096138.71092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096138.71116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096138.71134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096138.71271: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096138.71274: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 
1727096138.71277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.71329: Set connection var ansible_timeout to 10 11792 1727096138.71346: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096138.71366: Set connection var ansible_shell_executable to /bin/sh 11792 1727096138.71384: Set connection var ansible_pipelining to False 11792 1727096138.71503: Set connection var ansible_shell_type to sh 11792 1727096138.71507: Set connection var ansible_connection to ssh 11792 1727096138.71510: variable 'ansible_shell_executable' from source: unknown 11792 1727096138.71513: variable 'ansible_connection' from source: unknown 11792 1727096138.71515: variable 'ansible_module_compression' from source: unknown 11792 1727096138.71518: variable 'ansible_shell_type' from source: unknown 11792 1727096138.71520: variable 'ansible_shell_executable' from source: unknown 11792 1727096138.71523: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096138.71525: variable 'ansible_pipelining' from source: unknown 11792 1727096138.71528: variable 'ansible_timeout' from source: unknown 11792 1727096138.71531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096138.71767: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096138.71829: variable 'omit' from source: magic vars 11792 1727096138.71958: starting attempt loop 11792 1727096138.71962: running the handler 11792 1727096138.71965: _low_level_execute_command(): starting 11792 1727096138.71973: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096138.73547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.73551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096138.73564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.73584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.73660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096138.73693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.73784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.75528: stdout chunk (state=3): >>>/root <<< 11792 1727096138.75661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 
1727096138.75686: stderr chunk (state=3): >>><<< 11792 1727096138.75689: stdout chunk (state=3): >>><<< 11792 1727096138.75709: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096138.75725: _low_level_execute_command(): starting 11792 1727096138.75734: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995 `" && echo ansible-tmp-1727096138.7570965-12742-69524289690995="` echo /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995 `" ) && sleep 0' 11792 1727096138.76315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.76342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.76353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.76355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.76418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096138.76422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096138.76448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.76477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.78508: stdout chunk (state=3): >>>ansible-tmp-1727096138.7570965-12742-69524289690995=/root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995 <<< 11792 
1727096138.78610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096138.78636: stderr chunk (state=3): >>><<< 11792 1727096138.78640: stdout chunk (state=3): >>><<< 11792 1727096138.78657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096138.7570965-12742-69524289690995=/root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096138.78705: variable 'ansible_module_compression' from source: unknown 11792 1727096138.78755: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096138.78787: variable 'ansible_facts' from source: unknown 11792 1727096138.78856: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/AnsiballZ_stat.py 11792 1727096138.78964: Sending initial data 11792 1727096138.78970: Sent initial data (152 bytes) 11792 1727096138.79648: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.79652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.79655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096138.79657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.79669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096138.79684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096138.79696: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11792 1727096138.79779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.81448: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096138.81508: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096138.81547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp0mj6vkhu /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/AnsiballZ_stat.py <<< 11792 1727096138.81550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/AnsiballZ_stat.py" <<< 11792 1727096138.81590: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp0mj6vkhu" to remote "/root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/AnsiballZ_stat.py" <<< 11792 1727096138.81605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/AnsiballZ_stat.py" <<< 11792 1727096138.82280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096138.82344: stderr chunk (state=3): >>><<< 11792 1727096138.82347: stdout chunk (state=3): >>><<< 11792 1727096138.82488: done transferring module to remote 11792 1727096138.82492: _low_level_execute_command(): starting 11792 1727096138.82494: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/ /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/AnsiballZ_stat.py && sleep 0' 11792 1727096138.83185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096138.83262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.83294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096138.85196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096138.85209: stderr chunk (state=3): >>><<< 11792 1727096138.85212: stdout chunk (state=3): >>><<< 11792 1727096138.85234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096138.85237: _low_level_execute_command(): starting 11792 1727096138.85240: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/AnsiballZ_stat.py && sleep 0' 11792 1727096138.85820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096138.85829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096138.85865: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096138.85905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096138.85910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096138.85957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.02309: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": 
false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096139.03324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096139.03456: stderr chunk (state=3): >>><<< 11792 1727096139.03459: stdout chunk (state=3): >>><<< 11792 1727096139.03462: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096139.03465: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096139.03470: _low_level_execute_command(): starting 11792 1727096139.03473: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096138.7570965-12742-69524289690995/ > /dev/null 2>&1 && sleep 0' 11792 1727096139.04820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096139.04824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096139.04827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096139.04829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096139.04883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096139.04895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096139.05606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.07629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096139.07633: stdout chunk (state=3): >>><<< 11792 1727096139.07636: stderr chunk (state=3): >>><<< 11792 1727096139.07638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096139.07641: handler run complete 11792 1727096139.07646: attempt loop complete, returning result 11792 1727096139.07648: _execute() done 11792 1727096139.07653: dumping result to json 11792 1727096139.07656: done dumping result, returning 11792 1727096139.07775: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0afff68d-5257-d9c7-3fc0-0000000004fd] 11792 1727096139.07778: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004fd ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11792 1727096139.07908: no more pending results, returning what we have 11792 1727096139.07913: results queue empty 11792 1727096139.07914: checking for any_errors_fatal 11792 1727096139.07920: done checking for any_errors_fatal 11792 1727096139.07920: checking for max_fail_percentage 11792 1727096139.07922: done checking for max_fail_percentage 11792 1727096139.07924: checking to see if all hosts have failed and the running result is not ok 11792 1727096139.07924: done checking to see if all hosts have failed 11792 1727096139.07925: getting the remaining hosts for this loop 11792 1727096139.07927: done getting the remaining hosts for this loop 11792 1727096139.07931: getting the next task for host managed_node2 11792 1727096139.07940: done getting next task for host managed_node2 11792 1727096139.07943: ^ task is: TASK: Set NM profile exist flag based on the profile files 11792 1727096139.07949: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096139.07954: getting variables 11792 1727096139.07955: in VariableManager get_vars() 11792 1727096139.07994: Calling all_inventory to load vars for managed_node2 11792 1727096139.07997: Calling groups_inventory to load vars for managed_node2 11792 1727096139.08000: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096139.08014: Calling all_plugins_play to load vars for managed_node2 11792 1727096139.08017: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096139.08021: Calling groups_plugins_play to load vars for managed_node2 11792 1727096139.08929: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004fd 11792 1727096139.08935: WORKER PROCESS EXITING 11792 1727096139.11277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096139.14839: done with get_vars() 11792 1727096139.14879: done getting variables 11792 1727096139.14938: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:55:39 -0400 (0:00:00.466) 0:00:21.428 ****** 11792 1727096139.14976: entering _queue_task() for managed_node2/set_fact 11792 1727096139.15457: worker is 1 (out of 1 available) 11792 1727096139.15473: exiting _queue_task() for managed_node2/set_fact 11792 1727096139.15485: done queuing things up, now waiting for results queue to drain 11792 1727096139.15486: waiting for pending results... 
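Before the next trace, a sketch of what the conditional task at get_profile_stat.yml:17 plausibly contains; the when clause is taken from the false_condition reported just below, while the fact name is an assumption that mirrors the flags initialized at the top of this block:

    # Sketch only -- the when condition is logged below; the fact name is assumed.
    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists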
11792 1727096139.15723: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11792 1727096139.15968: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004fe 11792 1727096139.15972: variable 'ansible_search_path' from source: unknown 11792 1727096139.15975: variable 'ansible_search_path' from source: unknown 11792 1727096139.15979: calling self._execute() 11792 1727096139.15990: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.15996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.16006: variable 'omit' from source: magic vars 11792 1727096139.16388: variable 'ansible_distribution_major_version' from source: facts 11792 1727096139.16406: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096139.16533: variable 'profile_stat' from source: set_fact 11792 1727096139.16545: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096139.16548: when evaluation is False, skipping this task 11792 1727096139.16553: _execute() done 11792 1727096139.16556: dumping result to json 11792 1727096139.16559: done dumping result, returning 11792 1727096139.16562: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-d9c7-3fc0-0000000004fe] 11792 1727096139.16632: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004fe 11792 1727096139.16770: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004fe 11792 1727096139.16774: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096139.16826: no more pending results, returning what we have 11792 1727096139.16832: results queue empty 11792 1727096139.16833: checking for any_errors_fatal 11792 1727096139.16842: done checking for any_errors_fatal 11792 1727096139.16843: checking for max_fail_percentage 11792 1727096139.16845: done checking for max_fail_percentage 11792 1727096139.16846: checking to see if all hosts have failed and the running result is not ok 11792 1727096139.16846: done checking to see if all hosts have failed 11792 1727096139.16847: getting the remaining hosts for this loop 11792 1727096139.16849: done getting the remaining hosts for this loop 11792 1727096139.16852: getting the next task for host managed_node2 11792 1727096139.16859: done getting next task for host managed_node2 11792 1727096139.16862: ^ task is: TASK: Get NM profile info 11792 1727096139.16870: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096139.16875: getting variables 11792 1727096139.16876: in VariableManager get_vars() 11792 1727096139.16908: Calling all_inventory to load vars for managed_node2 11792 1727096139.16911: Calling groups_inventory to load vars for managed_node2 11792 1727096139.16915: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096139.16927: Calling all_plugins_play to load vars for managed_node2 11792 1727096139.16930: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096139.16933: Calling groups_plugins_play to load vars for managed_node2 11792 1727096139.19788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096139.21414: done with get_vars() 11792 1727096139.21450: done getting variables 11792 1727096139.21521: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:55:39 -0400 (0:00:00.065) 0:00:21.494 ****** 11792 1727096139.21563: entering _queue_task() for managed_node2/shell 11792 1727096139.21955: worker is 1 (out of 1 available) 11792 1727096139.22154: exiting _queue_task() for managed_node2/shell 11792 1727096139.22185: done queuing things up, now waiting for results queue to drain 11792 1727096139.22187: waiting for pending results... 
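The task queued here is a shell pipeline; the raw command is confirmed by the module invocation later in this trace. A minimal sketch of the "Get NM profile info" task at get_profile_stat.yml:25, with the "{{ profile }}" templating, the register name, and the error handling treated as assumptions:

    # Sketch only -- the command itself is confirmed in the trace; templating, register, and
    # ignore_errors are assumptions.
    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      ignore_errors: true   # assumption: grep exits non-zero when no matching profile is on disk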
11792 1727096139.22631: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11792 1727096139.23082: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004ff 11792 1727096139.23086: variable 'ansible_search_path' from source: unknown 11792 1727096139.23089: variable 'ansible_search_path' from source: unknown 11792 1727096139.23091: calling self._execute() 11792 1727096139.23317: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.23323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.23326: variable 'omit' from source: magic vars 11792 1727096139.24222: variable 'ansible_distribution_major_version' from source: facts 11792 1727096139.24235: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096139.24239: variable 'omit' from source: magic vars 11792 1727096139.24461: variable 'omit' from source: magic vars 11792 1727096139.24758: variable 'profile' from source: include params 11792 1727096139.24762: variable 'bond_port_profile' from source: include params 11792 1727096139.24949: variable 'bond_port_profile' from source: include params 11792 1727096139.24971: variable 'omit' from source: magic vars 11792 1727096139.25015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096139.25177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096139.25272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096139.25276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096139.25279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096139.25282: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096139.25284: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.25286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.25581: Set connection var ansible_timeout to 10 11792 1727096139.25599: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096139.25608: Set connection var ansible_shell_executable to /bin/sh 11792 1727096139.25613: Set connection var ansible_pipelining to False 11792 1727096139.25616: Set connection var ansible_shell_type to sh 11792 1727096139.25619: Set connection var ansible_connection to ssh 11792 1727096139.25645: variable 'ansible_shell_executable' from source: unknown 11792 1727096139.25648: variable 'ansible_connection' from source: unknown 11792 1727096139.25653: variable 'ansible_module_compression' from source: unknown 11792 1727096139.25655: variable 'ansible_shell_type' from source: unknown 11792 1727096139.25658: variable 'ansible_shell_executable' from source: unknown 11792 1727096139.25660: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.25662: variable 'ansible_pipelining' from source: unknown 11792 1727096139.25665: variable 'ansible_timeout' from source: unknown 11792 1727096139.25669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.26322: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096139.26326: variable 'omit' from source: magic vars 11792 1727096139.26329: starting attempt loop 11792 1727096139.26331: running the handler 11792 1727096139.26334: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096139.26336: _low_level_execute_command(): starting 11792 1727096139.26338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096139.28076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096139.28084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096139.28087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096139.28091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096139.28156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096139.28189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.29995: stdout chunk (state=3): >>>/root <<< 11792 1727096139.30173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096139.30177: stderr chunk (state=3): >>><<< 11792 1727096139.30180: stdout chunk (state=3): >>><<< 11792 1727096139.30246: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096139.30262: _low_level_execute_command(): starting 11792 1727096139.30271: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082 `" && echo ansible-tmp-1727096139.302471-12772-127495477027082="` echo /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082 `" ) && sleep 0' 11792 1727096139.31548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096139.31566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096139.31789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096139.31821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.33861: stdout chunk (state=3): >>>ansible-tmp-1727096139.302471-12772-127495477027082=/root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082 <<< 11792 1727096139.34146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096139.34153: stderr chunk (state=3): >>><<< 11792 1727096139.34156: stdout chunk (state=3): >>><<< 11792 1727096139.34181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096139.302471-12772-127495477027082=/root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096139.34474: variable 'ansible_module_compression' from source: unknown 11792 1727096139.34477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096139.34484: variable 'ansible_facts' from source: unknown 11792 1727096139.34486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/AnsiballZ_command.py 11792 1727096139.34616: Sending initial data 11792 1727096139.34620: Sent initial data (155 bytes) 11792 1727096139.35574: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096139.35578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096139.35582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096139.35701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096139.35745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.37416: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096139.37423: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11792 1727096139.37442: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11792 1727096139.37448: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11792 1727096139.37456: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11792 1727096139.37463: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 11792 1727096139.37470: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11792 1727096139.37489: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096139.37557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096139.37590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp17od3xtv /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/AnsiballZ_command.py <<< 11792 1727096139.37594: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/AnsiballZ_command.py" <<< 11792 1727096139.37797: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp17od3xtv" to remote "/root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/AnsiballZ_command.py" <<< 11792 1727096139.38932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096139.38935: stdout chunk (state=3): >>><<< 11792 1727096139.38943: stderr chunk (state=3): >>><<< 11792 1727096139.39066: done transferring module to remote 11792 1727096139.39082: _low_level_execute_command(): starting 11792 1727096139.39085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/ /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/AnsiballZ_command.py && sleep 0' 11792 1727096139.39924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096139.39940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096139.39958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096139.39980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096139.39999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096139.40019: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096139.40036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096139.40062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096139.40080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096139.40091: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096139.40103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096139.40116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096139.40165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096139.40218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096139.40253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096139.40394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.42854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096139.42858: stdout 
chunk (state=3): >>><<< 11792 1727096139.42861: stderr chunk (state=3): >>><<< 11792 1727096139.43158: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096139.43162: _low_level_execute_command(): starting 11792 1727096139.43165: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/AnsiballZ_command.py && sleep 0' 11792 1727096139.43878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096139.43992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096139.44017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096139.44035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096139.44116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.73726: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-23 08:55:39.600281", "end": "2024-09-23 08:55:39.735899", "delta": "0:00:00.135618", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f 
NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096139.75563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096139.75809: stdout chunk (state=3): >>><<< 11792 1727096139.75813: stderr chunk (state=3): >>><<< 11792 1727096139.75815: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-23 08:55:39.600281", "end": "2024-09-23 08:55:39.735899", "delta": "0:00:00.135618", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
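For reference, the module arguments logged above correspond to a shell-style task roughly like the following. This is a hedged reconstruction from the logged _raw_params and from the nm_profile_exists register referenced later in this run; it is not the literal content of get_profile_stat.yml, and the changed_when line is inferred from the fact that the final task result reports changed: false even though the command executed.

# Sketch of the "Get NM profile info" task, inferred from the logged module_args.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc
  # 'bond0' is the templated profile value in this run (profile/bond_port_profile)
  register: nm_profile_exists
  changed_when: false   # inferred from the logged result; the real task may express this differently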
11792 1727096139.75818: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096139.75825: _low_level_execute_command(): starting 11792 1727096139.75828: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096139.302471-12772-127495477027082/ > /dev/null 2>&1 && sleep 0' 11792 1727096139.77797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096139.77973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096139.78187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096139.80029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096139.80044: stdout chunk (state=3): >>><<< 11792 1727096139.80063: stderr chunk (state=3): >>><<< 11792 1727096139.80092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096139.80112: handler run complete 11792 1727096139.80143: Evaluated conditional (False): False 11792 1727096139.80163: attempt loop complete, returning result 11792 1727096139.80175: _execute() done 11792 1727096139.80182: dumping result to json 11792 1727096139.80192: done dumping result, returning 11792 1727096139.80211: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0afff68d-5257-d9c7-3fc0-0000000004ff] 11792 1727096139.80272: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004ff 11792 1727096139.80486: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004ff ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.135618", "end": "2024-09-23 08:55:39.735899", "rc": 0, "start": "2024-09-23 08:55:39.600281" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11792 1727096139.80553: no more pending results, returning what we have 11792 1727096139.80557: results queue empty 11792 1727096139.80558: checking for any_errors_fatal 11792 1727096139.80566: done checking for any_errors_fatal 11792 1727096139.80566: checking for max_fail_percentage 11792 1727096139.80570: done checking for max_fail_percentage 11792 1727096139.80571: checking to see if all hosts have failed and the running result is not ok 11792 1727096139.80572: done checking to see if all hosts have failed 11792 1727096139.80573: getting the remaining hosts for this loop 11792 1727096139.80574: done getting the remaining hosts for this loop 11792 1727096139.80577: getting the next task for host managed_node2 11792 1727096139.80585: done getting next task for host managed_node2 11792 1727096139.80589: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11792 1727096139.80714: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096139.80719: getting variables 11792 1727096139.80720: in VariableManager get_vars() 11792 1727096139.80752: Calling all_inventory to load vars for managed_node2 11792 1727096139.80755: Calling groups_inventory to load vars for managed_node2 11792 1727096139.80758: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096139.80774: Calling all_plugins_play to load vars for managed_node2 11792 1727096139.80778: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096139.80783: WORKER PROCESS EXITING 11792 1727096139.80788: Calling groups_plugins_play to load vars for managed_node2 11792 1727096139.82578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096139.84868: done with get_vars() 11792 1727096139.84900: done getting variables 11792 1727096139.84962: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:55:39 -0400 (0:00:00.634) 0:00:22.129 ****** 11792 1727096139.85004: entering _queue_task() for managed_node2/set_fact 11792 1727096139.85574: worker is 1 (out of 1 available) 11792 1727096139.85586: exiting _queue_task() for managed_node2/set_fact 11792 1727096139.85597: done queuing things up, now waiting for results queue to drain 11792 1727096139.85598: waiting for pending results... 
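The set_fact task queued here (get_profile_stat.yml:35) can be sketched from what the log shows directly: its conditional (nm_profile_exists.rc == 0) and the three facts reported in its result below. The exact YAML in the test file may differ, but the effective behaviour is:

# Hedged sketch of the task at get_profile_stat.yml:35.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0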
11792 1727096139.85993: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11792 1727096139.86373: in run() - task 0afff68d-5257-d9c7-3fc0-000000000500 11792 1727096139.86377: variable 'ansible_search_path' from source: unknown 11792 1727096139.86380: variable 'ansible_search_path' from source: unknown 11792 1727096139.86383: calling self._execute() 11792 1727096139.86644: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.86650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.86659: variable 'omit' from source: magic vars 11792 1727096139.87294: variable 'ansible_distribution_major_version' from source: facts 11792 1727096139.87306: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096139.87443: variable 'nm_profile_exists' from source: set_fact 11792 1727096139.87457: Evaluated conditional (nm_profile_exists.rc == 0): True 11792 1727096139.87464: variable 'omit' from source: magic vars 11792 1727096139.87528: variable 'omit' from source: magic vars 11792 1727096139.87598: variable 'omit' from source: magic vars 11792 1727096139.87641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096139.87690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096139.87711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096139.87728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096139.87865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096139.87871: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096139.87897: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.87901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.87904: Set connection var ansible_timeout to 10 11792 1727096139.87921: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096139.87930: Set connection var ansible_shell_executable to /bin/sh 11792 1727096139.87936: Set connection var ansible_pipelining to False 11792 1727096139.87938: Set connection var ansible_shell_type to sh 11792 1727096139.87941: Set connection var ansible_connection to ssh 11792 1727096139.87966: variable 'ansible_shell_executable' from source: unknown 11792 1727096139.87971: variable 'ansible_connection' from source: unknown 11792 1727096139.87973: variable 'ansible_module_compression' from source: unknown 11792 1727096139.87976: variable 'ansible_shell_type' from source: unknown 11792 1727096139.87978: variable 'ansible_shell_executable' from source: unknown 11792 1727096139.87989: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.87992: variable 'ansible_pipelining' from source: unknown 11792 1727096139.87996: variable 'ansible_timeout' from source: unknown 11792 1727096139.87999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.88208: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096139.88211: variable 'omit' from source: magic vars 11792 1727096139.88214: starting attempt loop 11792 1727096139.88216: running the handler 11792 1727096139.88218: handler run complete 11792 1727096139.88221: attempt loop complete, returning result 11792 1727096139.88223: _execute() done 11792 1727096139.88225: dumping result to json 11792 1727096139.88228: done dumping result, returning 11792 1727096139.88231: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-d9c7-3fc0-000000000500] 11792 1727096139.88233: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000500 11792 1727096139.88359: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000500 11792 1727096139.88363: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11792 1727096139.88430: no more pending results, returning what we have 11792 1727096139.88434: results queue empty 11792 1727096139.88434: checking for any_errors_fatal 11792 1727096139.88442: done checking for any_errors_fatal 11792 1727096139.88443: checking for max_fail_percentage 11792 1727096139.88445: done checking for max_fail_percentage 11792 1727096139.88446: checking to see if all hosts have failed and the running result is not ok 11792 1727096139.88447: done checking to see if all hosts have failed 11792 1727096139.88448: getting the remaining hosts for this loop 11792 1727096139.88450: done getting the remaining hosts for this loop 11792 1727096139.88454: getting the next task for host managed_node2 11792 1727096139.88465: done getting next task for host managed_node2 11792 1727096139.88469: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11792 1727096139.88475: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096139.88481: getting variables 11792 1727096139.88482: in VariableManager get_vars() 11792 1727096139.88519: Calling all_inventory to load vars for managed_node2 11792 1727096139.88638: Calling groups_inventory to load vars for managed_node2 11792 1727096139.88642: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096139.88654: Calling all_plugins_play to load vars for managed_node2 11792 1727096139.88657: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096139.88661: Calling groups_plugins_play to load vars for managed_node2 11792 1727096139.90228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096139.92054: done with get_vars() 11792 1727096139.92093: done getting variables 11792 1727096139.92165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096139.92289: variable 'profile' from source: include params 11792 1727096139.92293: variable 'bond_port_profile' from source: include params 11792 1727096139.92356: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:55:39 -0400 (0:00:00.073) 0:00:22.203 ****** 11792 1727096139.92389: entering _queue_task() for managed_node2/command 11792 1727096139.92744: worker is 1 (out of 1 available) 11792 1727096139.92765: exiting _queue_task() for managed_node2/command 11792 1727096139.92781: done queuing things up, now waiting for results queue to drain 11792 1727096139.92783: waiting for pending results... 
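The "Set connection var" lines above show the per-task connection settings in effect. Expressed as inventory-style variables (a sketch only; the log does not show where each value is actually defined, and several are simply ansible-core defaults), they amount to:

ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
# ansible_module_compression is ZIP_DEFLATED per the log; that is the normal
# internal default rather than something usually set in inventory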
11792 1727096139.93095: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 11792 1727096139.93289: in run() - task 0afff68d-5257-d9c7-3fc0-000000000502 11792 1727096139.93297: variable 'ansible_search_path' from source: unknown 11792 1727096139.93300: variable 'ansible_search_path' from source: unknown 11792 1727096139.93303: calling self._execute() 11792 1727096139.93321: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.93327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.93337: variable 'omit' from source: magic vars 11792 1727096139.93732: variable 'ansible_distribution_major_version' from source: facts 11792 1727096139.93736: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096139.93838: variable 'profile_stat' from source: set_fact 11792 1727096139.93854: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096139.93858: when evaluation is False, skipping this task 11792 1727096139.93861: _execute() done 11792 1727096139.93863: dumping result to json 11792 1727096139.93865: done dumping result, returning 11792 1727096139.93869: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [0afff68d-5257-d9c7-3fc0-000000000502] 11792 1727096139.93949: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000502 11792 1727096139.94013: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000502 11792 1727096139.94016: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096139.94070: no more pending results, returning what we have 11792 1727096139.94074: results queue empty 11792 1727096139.94074: checking for any_errors_fatal 11792 1727096139.94083: done checking for any_errors_fatal 11792 1727096139.94083: checking for max_fail_percentage 11792 1727096139.94085: done checking for max_fail_percentage 11792 1727096139.94086: checking to see if all hosts have failed and the running result is not ok 11792 1727096139.94086: done checking to see if all hosts have failed 11792 1727096139.94087: getting the remaining hosts for this loop 11792 1727096139.94089: done getting the remaining hosts for this loop 11792 1727096139.94092: getting the next task for host managed_node2 11792 1727096139.94099: done getting next task for host managed_node2 11792 1727096139.94101: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11792 1727096139.94105: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096139.94109: getting variables 11792 1727096139.94110: in VariableManager get_vars() 11792 1727096139.94136: Calling all_inventory to load vars for managed_node2 11792 1727096139.94138: Calling groups_inventory to load vars for managed_node2 11792 1727096139.94141: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096139.94150: Calling all_plugins_play to load vars for managed_node2 11792 1727096139.94155: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096139.94157: Calling groups_plugins_play to load vars for managed_node2 11792 1727096139.95720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096139.97357: done with get_vars() 11792 1727096139.97393: done getting variables 11792 1727096139.97462: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096139.97596: variable 'profile' from source: include params 11792 1727096139.97600: variable 'bond_port_profile' from source: include params 11792 1727096139.97662: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:55:39 -0400 (0:00:00.053) 0:00:22.256 ****** 11792 1727096139.97704: entering _queue_task() for managed_node2/set_fact 11792 1727096139.98085: worker is 1 (out of 1 available) 11792 1727096139.98097: exiting _queue_task() for managed_node2/set_fact 11792 1727096139.98109: done queuing things up, now waiting for results queue to drain 11792 1727096139.98111: waiting for pending results... 
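The ifcfg inspection tasks at get_profile_stat.yml:49, 56, 62 and 69 are all skipped in this run because profile_stat.stat.exists is false (the connection is stored as a NetworkManager keyfile, not an ifcfg file). As a minimal illustration of the gating pattern that produces the "skipping" results with false_condition, assuming a placeholder command because the tasks' own bodies are not visible in this log:

# Illustration only; the real command at get_profile_stat.yml:49 is not shown in this log.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: "true"                   # placeholder command
  when: profile_stat.stat.exists    # false here, so the task is skipped and false_condition is reported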
11792 1727096139.98588: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 11792 1727096139.98594: in run() - task 0afff68d-5257-d9c7-3fc0-000000000503 11792 1727096139.98597: variable 'ansible_search_path' from source: unknown 11792 1727096139.98599: variable 'ansible_search_path' from source: unknown 11792 1727096139.98702: calling self._execute() 11792 1727096139.98730: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096139.98742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096139.98758: variable 'omit' from source: magic vars 11792 1727096139.99143: variable 'ansible_distribution_major_version' from source: facts 11792 1727096139.99166: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096139.99300: variable 'profile_stat' from source: set_fact 11792 1727096139.99317: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096139.99325: when evaluation is False, skipping this task 11792 1727096139.99332: _execute() done 11792 1727096139.99340: dumping result to json 11792 1727096139.99355: done dumping result, returning 11792 1727096139.99460: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0afff68d-5257-d9c7-3fc0-000000000503] 11792 1727096139.99464: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000503 11792 1727096139.99525: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000503 11792 1727096139.99528: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096139.99583: no more pending results, returning what we have 11792 1727096139.99588: results queue empty 11792 1727096139.99589: checking for any_errors_fatal 11792 1727096139.99596: done checking for any_errors_fatal 11792 1727096139.99596: checking for max_fail_percentage 11792 1727096139.99598: done checking for max_fail_percentage 11792 1727096139.99600: checking to see if all hosts have failed and the running result is not ok 11792 1727096139.99600: done checking to see if all hosts have failed 11792 1727096139.99601: getting the remaining hosts for this loop 11792 1727096139.99603: done getting the remaining hosts for this loop 11792 1727096139.99607: getting the next task for host managed_node2 11792 1727096139.99618: done getting next task for host managed_node2 11792 1727096139.99620: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11792 1727096139.99627: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096139.99631: getting variables 11792 1727096139.99633: in VariableManager get_vars() 11792 1727096139.99674: Calling all_inventory to load vars for managed_node2 11792 1727096139.99677: Calling groups_inventory to load vars for managed_node2 11792 1727096139.99680: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096139.99695: Calling all_plugins_play to load vars for managed_node2 11792 1727096139.99699: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096139.99702: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.01484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.03264: done with get_vars() 11792 1727096140.03297: done getting variables 11792 1727096140.03370: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096140.03605: variable 'profile' from source: include params 11792 1727096140.03608: variable 'bond_port_profile' from source: include params 11792 1727096140.03665: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:55:40 -0400 (0:00:00.060) 0:00:22.316 ****** 11792 1727096140.03707: entering _queue_task() for managed_node2/command 11792 1727096140.04100: worker is 1 (out of 1 available) 11792 1727096140.04113: exiting _queue_task() for managed_node2/command 11792 1727096140.04125: done queuing things up, now waiting for results queue to drain 11792 1727096140.04127: waiting for pending results... 
11792 1727096140.04564: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 11792 1727096140.04573: in run() - task 0afff68d-5257-d9c7-3fc0-000000000504 11792 1727096140.04577: variable 'ansible_search_path' from source: unknown 11792 1727096140.04580: variable 'ansible_search_path' from source: unknown 11792 1727096140.04583: calling self._execute() 11792 1727096140.04680: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.04686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.04695: variable 'omit' from source: magic vars 11792 1727096140.05103: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.05113: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.05244: variable 'profile_stat' from source: set_fact 11792 1727096140.05265: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096140.05270: when evaluation is False, skipping this task 11792 1727096140.05273: _execute() done 11792 1727096140.05276: dumping result to json 11792 1727096140.05278: done dumping result, returning 11792 1727096140.05309: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 [0afff68d-5257-d9c7-3fc0-000000000504] 11792 1727096140.05370: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000504 11792 1727096140.05644: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000504 11792 1727096140.05648: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096140.05699: no more pending results, returning what we have 11792 1727096140.05704: results queue empty 11792 1727096140.05705: checking for any_errors_fatal 11792 1727096140.05709: done checking for any_errors_fatal 11792 1727096140.05710: checking for max_fail_percentage 11792 1727096140.05712: done checking for max_fail_percentage 11792 1727096140.05713: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.05714: done checking to see if all hosts have failed 11792 1727096140.05714: getting the remaining hosts for this loop 11792 1727096140.05716: done getting the remaining hosts for this loop 11792 1727096140.05719: getting the next task for host managed_node2 11792 1727096140.05726: done getting next task for host managed_node2 11792 1727096140.05728: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11792 1727096140.05733: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096140.05737: getting variables 11792 1727096140.05738: in VariableManager get_vars() 11792 1727096140.05765: Calling all_inventory to load vars for managed_node2 11792 1727096140.05770: Calling groups_inventory to load vars for managed_node2 11792 1727096140.05773: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.05783: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.05786: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.05789: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.07101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.09114: done with get_vars() 11792 1727096140.09146: done getting variables 11792 1727096140.09211: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096140.09326: variable 'profile' from source: include params 11792 1727096140.09330: variable 'bond_port_profile' from source: include params 11792 1727096140.09392: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:55:40 -0400 (0:00:00.057) 0:00:22.373 ****** 11792 1727096140.09429: entering _queue_task() for managed_node2/set_fact 11792 1727096140.09915: worker is 1 (out of 1 available) 11792 1727096140.09927: exiting _queue_task() for managed_node2/set_fact 11792 1727096140.09939: done queuing things up, now waiting for results queue to drain 11792 1727096140.09940: waiting for pending results... 
11792 1727096140.10183: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 11792 1727096140.10311: in run() - task 0afff68d-5257-d9c7-3fc0-000000000505 11792 1727096140.10315: variable 'ansible_search_path' from source: unknown 11792 1727096140.10318: variable 'ansible_search_path' from source: unknown 11792 1727096140.10333: calling self._execute() 11792 1727096140.10432: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.10445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.10456: variable 'omit' from source: magic vars 11792 1727096140.10849: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.10927: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.10992: variable 'profile_stat' from source: set_fact 11792 1727096140.11006: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096140.11010: when evaluation is False, skipping this task 11792 1727096140.11012: _execute() done 11792 1727096140.11015: dumping result to json 11792 1727096140.11018: done dumping result, returning 11792 1727096140.11025: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [0afff68d-5257-d9c7-3fc0-000000000505] 11792 1727096140.11036: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000505 11792 1727096140.11126: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000505 11792 1727096140.11129: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096140.11198: no more pending results, returning what we have 11792 1727096140.11203: results queue empty 11792 1727096140.11203: checking for any_errors_fatal 11792 1727096140.11212: done checking for any_errors_fatal 11792 1727096140.11213: checking for max_fail_percentage 11792 1727096140.11215: done checking for max_fail_percentage 11792 1727096140.11216: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.11217: done checking to see if all hosts have failed 11792 1727096140.11218: getting the remaining hosts for this loop 11792 1727096140.11220: done getting the remaining hosts for this loop 11792 1727096140.11224: getting the next task for host managed_node2 11792 1727096140.11233: done getting next task for host managed_node2 11792 1727096140.11235: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11792 1727096140.11240: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096140.11245: getting variables 11792 1727096140.11246: in VariableManager get_vars() 11792 1727096140.11283: Calling all_inventory to load vars for managed_node2 11792 1727096140.11286: Calling groups_inventory to load vars for managed_node2 11792 1727096140.11290: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.11305: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.11308: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.11310: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.12857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.14464: done with get_vars() 11792 1727096140.14497: done getting variables 11792 1727096140.14561: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096140.14689: variable 'profile' from source: include params 11792 1727096140.14693: variable 'bond_port_profile' from source: include params 11792 1727096140.14756: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:55:40 -0400 (0:00:00.053) 0:00:22.427 ****** 11792 1727096140.14791: entering _queue_task() for managed_node2/assert 11792 1727096140.15271: worker is 1 (out of 1 available) 11792 1727096140.15283: exiting _queue_task() for managed_node2/assert 11792 1727096140.15293: done queuing things up, now waiting for results queue to drain 11792 1727096140.15295: waiting for pending results... 
11792 1727096140.15588: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' 11792 1727096140.15594: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004da 11792 1727096140.15597: variable 'ansible_search_path' from source: unknown 11792 1727096140.15600: variable 'ansible_search_path' from source: unknown 11792 1727096140.15718: calling self._execute() 11792 1727096140.15733: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.15810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.15823: variable 'omit' from source: magic vars 11792 1727096140.16247: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.16250: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.16262: variable 'omit' from source: magic vars 11792 1727096140.16313: variable 'omit' from source: magic vars 11792 1727096140.16426: variable 'profile' from source: include params 11792 1727096140.16437: variable 'bond_port_profile' from source: include params 11792 1727096140.16505: variable 'bond_port_profile' from source: include params 11792 1727096140.16526: variable 'omit' from source: magic vars 11792 1727096140.16588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096140.16672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096140.16675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096140.16678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.16680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.16706: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096140.16709: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.16712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.16829: Set connection var ansible_timeout to 10 11792 1727096140.16837: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096140.16847: Set connection var ansible_shell_executable to /bin/sh 11792 1727096140.16854: Set connection var ansible_pipelining to False 11792 1727096140.16857: Set connection var ansible_shell_type to sh 11792 1727096140.16859: Set connection var ansible_connection to ssh 11792 1727096140.16891: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.16894: variable 'ansible_connection' from source: unknown 11792 1727096140.16896: variable 'ansible_module_compression' from source: unknown 11792 1727096140.16900: variable 'ansible_shell_type' from source: unknown 11792 1727096140.17078: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.17082: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.17085: variable 'ansible_pipelining' from source: unknown 11792 1727096140.17087: variable 'ansible_timeout' from source: unknown 11792 1727096140.17089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.17092: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096140.17094: variable 'omit' from source: magic vars 11792 1727096140.17096: starting attempt loop 11792 1727096140.17098: running the handler 11792 1727096140.17196: variable 'lsr_net_profile_exists' from source: set_fact 11792 1727096140.17199: Evaluated conditional (lsr_net_profile_exists): True 11792 1727096140.17212: handler run complete 11792 1727096140.17227: attempt loop complete, returning result 11792 1727096140.17230: _execute() done 11792 1727096140.17232: dumping result to json 11792 1727096140.17235: done dumping result, returning 11792 1727096140.17243: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' [0afff68d-5257-d9c7-3fc0-0000000004da] 11792 1727096140.17245: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004da ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096140.17392: no more pending results, returning what we have 11792 1727096140.17396: results queue empty 11792 1727096140.17397: checking for any_errors_fatal 11792 1727096140.17404: done checking for any_errors_fatal 11792 1727096140.17405: checking for max_fail_percentage 11792 1727096140.17407: done checking for max_fail_percentage 11792 1727096140.17408: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.17409: done checking to see if all hosts have failed 11792 1727096140.17410: getting the remaining hosts for this loop 11792 1727096140.17411: done getting the remaining hosts for this loop 11792 1727096140.17415: getting the next task for host managed_node2 11792 1727096140.17539: done getting next task for host managed_node2 11792 1727096140.17542: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11792 1727096140.17547: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096140.17552: getting variables 11792 1727096140.17554: in VariableManager get_vars() 11792 1727096140.17590: Calling all_inventory to load vars for managed_node2 11792 1727096140.17593: Calling groups_inventory to load vars for managed_node2 11792 1727096140.17598: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.17612: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.17615: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.17619: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.18172: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004da 11792 1727096140.18179: WORKER PROCESS EXITING 11792 1727096140.19282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.20794: done with get_vars() 11792 1727096140.20828: done getting variables 11792 1727096140.20890: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096140.21014: variable 'profile' from source: include params 11792 1727096140.21022: variable 'bond_port_profile' from source: include params 11792 1727096140.21082: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:55:40 -0400 (0:00:00.063) 0:00:22.490 ****** 11792 1727096140.21115: entering _queue_task() for managed_node2/assert 11792 1727096140.21581: worker is 1 (out of 1 available) 11792 1727096140.21590: exiting _queue_task() for managed_node2/assert 11792 1727096140.21602: done queuing things up, now waiting for results queue to drain 11792 1727096140.21603: waiting for pending results... 
11792 1727096140.21797: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' 11792 1727096140.22004: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004db 11792 1727096140.22008: variable 'ansible_search_path' from source: unknown 11792 1727096140.22012: variable 'ansible_search_path' from source: unknown 11792 1727096140.22015: calling self._execute() 11792 1727096140.22097: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.22119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.22133: variable 'omit' from source: magic vars 11792 1727096140.22507: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.22523: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.22535: variable 'omit' from source: magic vars 11792 1727096140.22606: variable 'omit' from source: magic vars 11792 1727096140.22769: variable 'profile' from source: include params 11792 1727096140.22773: variable 'bond_port_profile' from source: include params 11792 1727096140.22810: variable 'bond_port_profile' from source: include params 11792 1727096140.22837: variable 'omit' from source: magic vars 11792 1727096140.22892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096140.22937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096140.22963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096140.23073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.23076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.23079: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096140.23081: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.23089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.23173: Set connection var ansible_timeout to 10 11792 1727096140.23188: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096140.23210: Set connection var ansible_shell_executable to /bin/sh 11792 1727096140.23272: Set connection var ansible_pipelining to False 11792 1727096140.23275: Set connection var ansible_shell_type to sh 11792 1727096140.23278: Set connection var ansible_connection to ssh 11792 1727096140.23280: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.23282: variable 'ansible_connection' from source: unknown 11792 1727096140.23284: variable 'ansible_module_compression' from source: unknown 11792 1727096140.23286: variable 'ansible_shell_type' from source: unknown 11792 1727096140.23288: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.23290: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.23291: variable 'ansible_pipelining' from source: unknown 11792 1727096140.23294: variable 'ansible_timeout' from source: unknown 11792 1727096140.23296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.23452: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096140.23471: variable 'omit' from source: magic vars 11792 1727096140.23483: starting attempt loop 11792 1727096140.23490: running the handler 11792 1727096140.23639: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11792 1727096140.23642: Evaluated conditional (lsr_net_profile_ansible_managed): True 11792 1727096140.23644: handler run complete 11792 1727096140.23659: attempt loop complete, returning result 11792 1727096140.23667: _execute() done 11792 1727096140.23748: dumping result to json 11792 1727096140.23752: done dumping result, returning 11792 1727096140.23754: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' [0afff68d-5257-d9c7-3fc0-0000000004db] 11792 1727096140.23756: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004db 11792 1727096140.23825: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004db 11792 1727096140.23829: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096140.23904: no more pending results, returning what we have 11792 1727096140.23907: results queue empty 11792 1727096140.23908: checking for any_errors_fatal 11792 1727096140.23916: done checking for any_errors_fatal 11792 1727096140.23917: checking for max_fail_percentage 11792 1727096140.23919: done checking for max_fail_percentage 11792 1727096140.23921: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.23921: done checking to see if all hosts have failed 11792 1727096140.23922: getting the remaining hosts for this loop 11792 1727096140.23923: done getting the remaining hosts for this loop 11792 1727096140.23927: getting the next task for host managed_node2 11792 1727096140.23936: done getting next task for host managed_node2 11792 1727096140.23939: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11792 1727096140.23944: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096140.23950: getting variables 11792 1727096140.23951: in VariableManager get_vars() 11792 1727096140.23989: Calling all_inventory to load vars for managed_node2 11792 1727096140.23992: Calling groups_inventory to load vars for managed_node2 11792 1727096140.23996: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.24008: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.24011: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.24014: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.25625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.27163: done with get_vars() 11792 1727096140.27190: done getting variables 11792 1727096140.27252: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096140.27377: variable 'profile' from source: include params 11792 1727096140.27381: variable 'bond_port_profile' from source: include params 11792 1727096140.27440: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:55:40 -0400 (0:00:00.063) 0:00:22.553 ****** 11792 1727096140.27473: entering _queue_task() for managed_node2/assert 11792 1727096140.27893: worker is 1 (out of 1 available) 11792 1727096140.27904: exiting _queue_task() for managed_node2/assert 11792 1727096140.27916: done queuing things up, now waiting for results queue to drain 11792 1727096140.27917: waiting for pending results... 
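The companion check at assert_profile_present.yml:15 follows the same pattern against the lsr_net_profile_fingerprint flag (evaluated to True further down); the raw task name template, "Assert that the fingerprint comment is present in {{ profile }}", is visible in the scheduler output above. A plausible sketch, again with the failure message assumed:

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      assert:
        that:
          - lsr_net_profile_fingerprint
        fail_msg: "The fingerprint comment is missing in {{ profile }}"  # assumed wording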
11792 1727096140.28293: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 11792 1727096140.28349: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004dc 11792 1727096140.28376: variable 'ansible_search_path' from source: unknown 11792 1727096140.28385: variable 'ansible_search_path' from source: unknown 11792 1727096140.28436: calling self._execute() 11792 1727096140.28545: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.28558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.28622: variable 'omit' from source: magic vars 11792 1727096140.28982: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.29000: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.29013: variable 'omit' from source: magic vars 11792 1727096140.29082: variable 'omit' from source: magic vars 11792 1727096140.29194: variable 'profile' from source: include params 11792 1727096140.29204: variable 'bond_port_profile' from source: include params 11792 1727096140.29373: variable 'bond_port_profile' from source: include params 11792 1727096140.29378: variable 'omit' from source: magic vars 11792 1727096140.29381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096140.29499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096140.29503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096140.29505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.29507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.29510: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096140.29512: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.29514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.29612: Set connection var ansible_timeout to 10 11792 1727096140.29627: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096140.29642: Set connection var ansible_shell_executable to /bin/sh 11792 1727096140.29652: Set connection var ansible_pipelining to False 11792 1727096140.29660: Set connection var ansible_shell_type to sh 11792 1727096140.29669: Set connection var ansible_connection to ssh 11792 1727096140.29697: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.29704: variable 'ansible_connection' from source: unknown 11792 1727096140.29719: variable 'ansible_module_compression' from source: unknown 11792 1727096140.29727: variable 'ansible_shell_type' from source: unknown 11792 1727096140.29734: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.29742: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.29750: variable 'ansible_pipelining' from source: unknown 11792 1727096140.29758: variable 'ansible_timeout' from source: unknown 11792 1727096140.29765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.29914: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096140.29938: variable 'omit' from source: magic vars 11792 1727096140.29949: starting attempt loop 11792 1727096140.30044: running the handler 11792 1727096140.30081: variable 'lsr_net_profile_fingerprint' from source: set_fact 11792 1727096140.30092: Evaluated conditional (lsr_net_profile_fingerprint): True 11792 1727096140.30103: handler run complete 11792 1727096140.30123: attempt loop complete, returning result 11792 1727096140.30130: _execute() done 11792 1727096140.30136: dumping result to json 11792 1727096140.30150: done dumping result, returning 11792 1727096140.30166: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 [0afff68d-5257-d9c7-3fc0-0000000004dc] 11792 1727096140.30179: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004dc ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096140.30434: no more pending results, returning what we have 11792 1727096140.30437: results queue empty 11792 1727096140.30438: checking for any_errors_fatal 11792 1727096140.30445: done checking for any_errors_fatal 11792 1727096140.30446: checking for max_fail_percentage 11792 1727096140.30448: done checking for max_fail_percentage 11792 1727096140.30449: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.30450: done checking to see if all hosts have failed 11792 1727096140.30451: getting the remaining hosts for this loop 11792 1727096140.30452: done getting the remaining hosts for this loop 11792 1727096140.30455: getting the next task for host managed_node2 11792 1727096140.30465: done getting next task for host managed_node2 11792 1727096140.30469: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11792 1727096140.30474: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096140.30478: getting variables 11792 1727096140.30480: in VariableManager get_vars() 11792 1727096140.30513: Calling all_inventory to load vars for managed_node2 11792 1727096140.30515: Calling groups_inventory to load vars for managed_node2 11792 1727096140.30519: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.30531: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.30534: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.30536: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.31107: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004dc 11792 1727096140.31116: WORKER PROCESS EXITING 11792 1727096140.37305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.39147: done with get_vars() 11792 1727096140.39184: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:55:40 -0400 (0:00:00.118) 0:00:22.672 ****** 11792 1727096140.39285: entering _queue_task() for managed_node2/include_tasks 11792 1727096140.39654: worker is 1 (out of 1 available) 11792 1727096140.39873: exiting _queue_task() for managed_node2/include_tasks 11792 1727096140.39886: done queuing things up, now waiting for results queue to drain 11792 1727096140.39889: waiting for pending results... 11792 1727096140.39990: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 11792 1727096140.40155: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004e0 11792 1727096140.40180: variable 'ansible_search_path' from source: unknown 11792 1727096140.40188: variable 'ansible_search_path' from source: unknown 11792 1727096140.40243: calling self._execute() 11792 1727096140.40353: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.40370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.40419: variable 'omit' from source: magic vars 11792 1727096140.40808: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.40828: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.40840: _execute() done 11792 1727096140.40872: dumping result to json 11792 1727096140.40879: done dumping result, returning 11792 1727096140.40883: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-d9c7-3fc0-0000000004e0] 11792 1727096140.40887: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e0 11792 1727096140.41147: no more pending results, returning what we have 11792 1727096140.41153: in VariableManager get_vars() 11792 1727096140.41203: Calling all_inventory to load vars for managed_node2 11792 1727096140.41207: Calling groups_inventory to load vars for managed_node2 11792 1727096140.41211: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.41226: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.41229: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.41231: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.41884: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e0 11792 
1727096140.41887: WORKER PROCESS EXITING 11792 1727096140.42785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.44376: done with get_vars() 11792 1727096140.44406: variable 'ansible_search_path' from source: unknown 11792 1727096140.44408: variable 'ansible_search_path' from source: unknown 11792 1727096140.44444: we have included files to process 11792 1727096140.44445: generating all_blocks data 11792 1727096140.44447: done generating all_blocks data 11792 1727096140.44452: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096140.44454: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096140.44455: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096140.45544: done processing included file 11792 1727096140.45546: iterating over new_blocks loaded from include file 11792 1727096140.45548: in VariableManager get_vars() 11792 1727096140.45573: done with get_vars() 11792 1727096140.45575: filtering new block on tags 11792 1727096140.45671: done filtering new block on tags 11792 1727096140.45675: in VariableManager get_vars() 11792 1727096140.45693: done with get_vars() 11792 1727096140.45695: filtering new block on tags 11792 1727096140.45772: done filtering new block on tags 11792 1727096140.45775: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 11792 1727096140.45781: extending task lists for all hosts with included blocks 11792 1727096140.46398: done extending task lists 11792 1727096140.46400: done processing included files 11792 1727096140.46401: results queue empty 11792 1727096140.46402: checking for any_errors_fatal 11792 1727096140.46406: done checking for any_errors_fatal 11792 1727096140.46407: checking for max_fail_percentage 11792 1727096140.46408: done checking for max_fail_percentage 11792 1727096140.46409: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.46410: done checking to see if all hosts have failed 11792 1727096140.46410: getting the remaining hosts for this loop 11792 1727096140.46412: done getting the remaining hosts for this loop 11792 1727096140.46414: getting the next task for host managed_node2 11792 1727096140.46419: done getting next task for host managed_node2 11792 1727096140.46422: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11792 1727096140.46425: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096140.46428: getting variables 11792 1727096140.46429: in VariableManager get_vars() 11792 1727096140.46439: Calling all_inventory to load vars for managed_node2 11792 1727096140.46442: Calling groups_inventory to load vars for managed_node2 11792 1727096140.46444: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.46450: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.46456: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.46460: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.47646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.49469: done with get_vars() 11792 1727096140.49506: done getting variables 11792 1727096140.49554: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:55:40 -0400 (0:00:00.103) 0:00:22.775 ****** 11792 1727096140.49595: entering _queue_task() for managed_node2/set_fact 11792 1727096140.50136: worker is 1 (out of 1 available) 11792 1727096140.50146: exiting _queue_task() for managed_node2/set_fact 11792 1727096140.50163: done queuing things up, now waiting for results queue to drain 11792 1727096140.50165: waiting for pending results... 
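get_profile_stat.yml opens at line 3 with a set_fact that resets the three flags the later assertions consume; the result block that follows confirms both the fact names and their initial false values. A minimal reconstruction under that assumption:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false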
11792 1727096140.50339: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 11792 1727096140.50480: in run() - task 0afff68d-5257-d9c7-3fc0-000000000558 11792 1727096140.50498: variable 'ansible_search_path' from source: unknown 11792 1727096140.50502: variable 'ansible_search_path' from source: unknown 11792 1727096140.50549: calling self._execute() 11792 1727096140.50650: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.50660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.50676: variable 'omit' from source: magic vars 11792 1727096140.51118: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.51129: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.51137: variable 'omit' from source: magic vars 11792 1727096140.51211: variable 'omit' from source: magic vars 11792 1727096140.51328: variable 'omit' from source: magic vars 11792 1727096140.51332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096140.51335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096140.51363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096140.51390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.51401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.51437: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096140.51441: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.51447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.51563: Set connection var ansible_timeout to 10 11792 1727096140.51772: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096140.51776: Set connection var ansible_shell_executable to /bin/sh 11792 1727096140.51778: Set connection var ansible_pipelining to False 11792 1727096140.51781: Set connection var ansible_shell_type to sh 11792 1727096140.51783: Set connection var ansible_connection to ssh 11792 1727096140.51785: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.51787: variable 'ansible_connection' from source: unknown 11792 1727096140.51790: variable 'ansible_module_compression' from source: unknown 11792 1727096140.51792: variable 'ansible_shell_type' from source: unknown 11792 1727096140.51795: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.51797: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.51799: variable 'ansible_pipelining' from source: unknown 11792 1727096140.51800: variable 'ansible_timeout' from source: unknown 11792 1727096140.51802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.51805: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096140.51808: variable 
'omit' from source: magic vars 11792 1727096140.51821: starting attempt loop 11792 1727096140.51824: running the handler 11792 1727096140.51839: handler run complete 11792 1727096140.51851: attempt loop complete, returning result 11792 1727096140.51857: _execute() done 11792 1727096140.51859: dumping result to json 11792 1727096140.51862: done dumping result, returning 11792 1727096140.51877: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-d9c7-3fc0-000000000558] 11792 1727096140.51880: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000558 11792 1727096140.52139: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000558 11792 1727096140.52145: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11792 1727096140.52287: no more pending results, returning what we have 11792 1727096140.52291: results queue empty 11792 1727096140.52292: checking for any_errors_fatal 11792 1727096140.52298: done checking for any_errors_fatal 11792 1727096140.52299: checking for max_fail_percentage 11792 1727096140.52301: done checking for max_fail_percentage 11792 1727096140.52302: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.52303: done checking to see if all hosts have failed 11792 1727096140.52304: getting the remaining hosts for this loop 11792 1727096140.52305: done getting the remaining hosts for this loop 11792 1727096140.52309: getting the next task for host managed_node2 11792 1727096140.52317: done getting next task for host managed_node2 11792 1727096140.52319: ^ task is: TASK: Stat profile file 11792 1727096140.52325: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096140.52329: getting variables 11792 1727096140.52330: in VariableManager get_vars() 11792 1727096140.52363: Calling all_inventory to load vars for managed_node2 11792 1727096140.52366: Calling groups_inventory to load vars for managed_node2 11792 1727096140.52371: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.52382: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.52385: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.52389: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.53913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.55580: done with get_vars() 11792 1727096140.55611: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:55:40 -0400 (0:00:00.061) 0:00:22.836 ****** 11792 1727096140.55728: entering _queue_task() for managed_node2/stat 11792 1727096140.56291: worker is 1 (out of 1 available) 11792 1727096140.56302: exiting _queue_task() for managed_node2/stat 11792 1727096140.56314: done queuing things up, now waiting for results queue to drain 11792 1727096140.56316: waiting for pending results... 11792 1727096140.56464: running TaskExecutor() for managed_node2/TASK: Stat profile file 11792 1727096140.56625: in run() - task 0afff68d-5257-d9c7-3fc0-000000000559 11792 1727096140.56631: variable 'ansible_search_path' from source: unknown 11792 1727096140.56636: variable 'ansible_search_path' from source: unknown 11792 1727096140.56670: calling self._execute() 11792 1727096140.56766: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.56775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.56785: variable 'omit' from source: magic vars 11792 1727096140.57214: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.57275: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.57278: variable 'omit' from source: magic vars 11792 1727096140.57316: variable 'omit' from source: magic vars 11792 1727096140.57423: variable 'profile' from source: include params 11792 1727096140.57428: variable 'bond_port_profile' from source: include params 11792 1727096140.57492: variable 'bond_port_profile' from source: include params 11792 1727096140.57519: variable 'omit' from source: magic vars 11792 1727096140.57565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096140.57603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096140.57630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096140.57648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.57662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096140.57696: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096140.57699: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 
1727096140.57702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.57817: Set connection var ansible_timeout to 10 11792 1727096140.57821: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096140.57835: Set connection var ansible_shell_executable to /bin/sh 11792 1727096140.57845: Set connection var ansible_pipelining to False 11792 1727096140.57848: Set connection var ansible_shell_type to sh 11792 1727096140.57851: Set connection var ansible_connection to ssh 11792 1727096140.57880: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.57884: variable 'ansible_connection' from source: unknown 11792 1727096140.57886: variable 'ansible_module_compression' from source: unknown 11792 1727096140.57889: variable 'ansible_shell_type' from source: unknown 11792 1727096140.57891: variable 'ansible_shell_executable' from source: unknown 11792 1727096140.57894: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.57896: variable 'ansible_pipelining' from source: unknown 11792 1727096140.57900: variable 'ansible_timeout' from source: unknown 11792 1727096140.57927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.58123: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096140.58145: variable 'omit' from source: magic vars 11792 1727096140.58148: starting attempt loop 11792 1727096140.58151: running the handler 11792 1727096140.58254: _low_level_execute_command(): starting 11792 1727096140.58257: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096140.58941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096140.58983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096140.58991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096140.59003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096140.59041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096140.59089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096140.59111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096140.59178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096140.60913: stdout chunk (state=3): >>>/root <<< 11792 1727096140.61026: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11792 1727096140.61059: stderr chunk (state=3): >>><<< 11792 1727096140.61084: stdout chunk (state=3): >>><<< 11792 1727096140.61276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096140.61280: _low_level_execute_command(): starting 11792 1727096140.61284: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663 `" && echo ansible-tmp-1727096140.6110723-12838-143596391636663="` echo /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663 `" ) && sleep 0' 11792 1727096140.61846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096140.61856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096140.61870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096140.61893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096140.61910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096140.61921: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096140.61948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096140.61954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096140.61957: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096140.61960: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096140.62060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096140.62063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096140.62066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096140.62070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096140.62073: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096140.62078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096140.62094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096140.62118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096140.62201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096140.64234: stdout chunk (state=3): >>>ansible-tmp-1727096140.6110723-12838-143596391636663=/root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663 <<< 11792 1727096140.64423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096140.64427: stdout chunk (state=3): >>><<< 11792 1727096140.64430: stderr chunk (state=3): >>><<< 11792 1727096140.64770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096140.6110723-12838-143596391636663=/root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096140.64775: variable 'ansible_module_compression' from source: unknown 11792 1727096140.64788: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096140.64834: variable 'ansible_facts' from source: unknown 11792 1727096140.64933: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/AnsiballZ_stat.py 11792 1727096140.65089: Sending initial data 11792 1727096140.65120: Sent initial data (153 bytes) 11792 1727096140.65892: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096140.65909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096140.65926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096140.66005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096140.67683: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096140.67738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096140.67800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpzrh95zii /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/AnsiballZ_stat.py <<< 11792 1727096140.67830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpzrh95zii" to remote "/root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/AnsiballZ_stat.py" <<< 11792 1727096140.68592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096140.68639: stderr chunk (state=3): >>><<< 11792 1727096140.68653: stdout chunk (state=3): >>><<< 11792 1727096140.68686: done transferring module to remote 11792 1727096140.68701: _low_level_execute_command(): starting 11792 1727096140.68711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/ /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/AnsiballZ_stat.py && sleep 0' 11792 1727096140.69419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096140.69487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096140.69560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096140.69578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096140.69629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096140.69779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096140.71692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096140.71703: stdout chunk (state=3): >>><<< 11792 1727096140.71727: stderr chunk (state=3): >>><<< 11792 1727096140.71742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096140.71749: _low_level_execute_command(): starting 11792 1727096140.71830: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/AnsiballZ_stat.py && sleep 0' 11792 1727096140.72392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096140.72421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096140.72483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096140.72557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096140.72576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096140.72607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096140.72683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096140.88836: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096140.90579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096140.90584: stdout chunk (state=3): >>><<< 11792 1727096140.90587: stderr chunk (state=3): >>><<< 11792 1727096140.90590: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
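The module result above ("exists": false for /etc/sysconfig/network-scripts/ifcfg-bond0.0), together with the invocation arguments echoed in the next entry, pins down the "Stat profile file" task at get_profile_stat.yml:9 fairly closely. A sketch of what it likely looks like; the path template is an assumption, since the log only shows the rendered value ifcfg-bond0.0:

    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # renders to ifcfg-bond0.0 here; template assumed
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat

The register name profile_stat is confirmed a few entries later, where the "Set NM profile exist flag" task is skipped because profile_stat.stat.exists is false.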
11792 1727096140.90594: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096140.90598: _low_level_execute_command(): starting 11792 1727096140.90600: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096140.6110723-12838-143596391636663/ > /dev/null 2>&1 && sleep 0' 11792 1727096140.91457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096140.91483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096140.91586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096140.91608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096140.91630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096140.91655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096140.91730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096140.93736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096140.93766: stdout chunk (state=3): >>><<< 11792 1727096140.93783: stderr chunk (state=3): >>><<< 11792 1727096140.93973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096140.93977: handler run complete 11792 1727096140.93980: attempt loop complete, returning result 11792 1727096140.93982: _execute() done 11792 1727096140.93984: dumping result to json 11792 1727096140.93986: done dumping result, returning 11792 1727096140.93988: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0afff68d-5257-d9c7-3fc0-000000000559] 11792 1727096140.93990: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000559 11792 1727096140.94064: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000559 11792 1727096140.94070: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11792 1727096140.94137: no more pending results, returning what we have 11792 1727096140.94141: results queue empty 11792 1727096140.94142: checking for any_errors_fatal 11792 1727096140.94149: done checking for any_errors_fatal 11792 1727096140.94150: checking for max_fail_percentage 11792 1727096140.94155: done checking for max_fail_percentage 11792 1727096140.94156: checking to see if all hosts have failed and the running result is not ok 11792 1727096140.94156: done checking to see if all hosts have failed 11792 1727096140.94157: getting the remaining hosts for this loop 11792 1727096140.94159: done getting the remaining hosts for this loop 11792 1727096140.94163: getting the next task for host managed_node2 11792 1727096140.94187: done getting next task for host managed_node2 11792 1727096140.94190: ^ task is: TASK: Set NM profile exist flag based on the profile files 11792 1727096140.94196: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096140.94201: getting variables 11792 1727096140.94203: in VariableManager get_vars() 11792 1727096140.94239: Calling all_inventory to load vars for managed_node2 11792 1727096140.94243: Calling groups_inventory to load vars for managed_node2 11792 1727096140.94246: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096140.94262: Calling all_plugins_play to load vars for managed_node2 11792 1727096140.94266: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096140.94375: Calling groups_plugins_play to load vars for managed_node2 11792 1727096140.96233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096140.98033: done with get_vars() 11792 1727096140.98069: done getting variables 11792 1727096140.98138: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:55:40 -0400 (0:00:00.424) 0:00:23.260 ****** 11792 1727096140.98177: entering _queue_task() for managed_node2/set_fact 11792 1727096140.98553: worker is 1 (out of 1 available) 11792 1727096140.98565: exiting _queue_task() for managed_node2/set_fact 11792 1727096140.98580: done queuing things up, now waiting for results queue to drain 11792 1727096140.98582: waiting for pending results... 
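The "Stat profile file" result above ("exists": false) comes from a plain stat call against the initscripts-style ifcfg path. Reconstructed from the module arguments logged for that call and from the profile_stat variable that later conditionals reference, the task (presumably in the same get_profile_stat.yml tasks file) plausibly looks like the sketch below; the {{ profile }} templating is an assumption, since the log only shows the already-resolved path:

    # Sketch of a tasks-file entry matching the logged stat invocation.
    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # resolved to ifcfg-bond0.0 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat  # later tasks test profile_stat.stat.exists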
11792 1727096140.98995: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11792 1727096140.99092: in run() - task 0afff68d-5257-d9c7-3fc0-00000000055a 11792 1727096140.99097: variable 'ansible_search_path' from source: unknown 11792 1727096140.99099: variable 'ansible_search_path' from source: unknown 11792 1727096140.99120: calling self._execute() 11792 1727096140.99235: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096140.99249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096140.99304: variable 'omit' from source: magic vars 11792 1727096140.99740: variable 'ansible_distribution_major_version' from source: facts 11792 1727096140.99744: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096140.99959: variable 'profile_stat' from source: set_fact 11792 1727096140.99965: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096140.99968: when evaluation is False, skipping this task 11792 1727096140.99971: _execute() done 11792 1727096140.99973: dumping result to json 11792 1727096140.99975: done dumping result, returning 11792 1727096140.99978: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-d9c7-3fc0-00000000055a] 11792 1727096140.99980: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055a 11792 1727096141.00054: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055a skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096141.00114: no more pending results, returning what we have 11792 1727096141.00119: results queue empty 11792 1727096141.00120: checking for any_errors_fatal 11792 1727096141.00129: done checking for any_errors_fatal 11792 1727096141.00130: checking for max_fail_percentage 11792 1727096141.00133: done checking for max_fail_percentage 11792 1727096141.00133: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.00134: done checking to see if all hosts have failed 11792 1727096141.00135: getting the remaining hosts for this loop 11792 1727096141.00137: done getting the remaining hosts for this loop 11792 1727096141.00141: getting the next task for host managed_node2 11792 1727096141.00150: done getting next task for host managed_node2 11792 1727096141.00154: ^ task is: TASK: Get NM profile info 11792 1727096141.00162: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096141.00170: getting variables 11792 1727096141.00171: in VariableManager get_vars() 11792 1727096141.00203: Calling all_inventory to load vars for managed_node2 11792 1727096141.00206: Calling groups_inventory to load vars for managed_node2 11792 1727096141.00209: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.00222: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.00224: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.00226: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.00882: WORKER PROCESS EXITING 11792 1727096141.02172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.03866: done with get_vars() 11792 1727096141.03896: done getting variables 11792 1727096141.03965: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:55:41 -0400 (0:00:00.058) 0:00:23.319 ****** 11792 1727096141.04002: entering _queue_task() for managed_node2/shell 11792 1727096141.04480: worker is 1 (out of 1 available) 11792 1727096141.04490: exiting _queue_task() for managed_node2/shell 11792 1727096141.04508: done queuing things up, now waiting for results queue to drain 11792 1727096141.04510: waiting for pending results... 
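The skipped set_fact above illustrates the guard pattern used throughout get_profile_stat.yml: tasks that inspect the ifcfg file only run when the stat above found it. A minimal sketch of that pattern, with the fact name assumed purely for illustration (the log confirms only the task name, the set_fact action, and the profile_stat.stat.exists condition):

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true  # hypothetical fact; the actual fact is not visible because the task was skipped
      when: profile_stat.stat.exists  # False here, so the task is skipped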
11792 1727096141.04735: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11792 1727096141.04939: in run() - task 0afff68d-5257-d9c7-3fc0-00000000055b 11792 1727096141.04943: variable 'ansible_search_path' from source: unknown 11792 1727096141.04946: variable 'ansible_search_path' from source: unknown 11792 1727096141.04973: calling self._execute() 11792 1727096141.05086: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.05099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.05113: variable 'omit' from source: magic vars 11792 1727096141.05527: variable 'ansible_distribution_major_version' from source: facts 11792 1727096141.05545: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096141.05588: variable 'omit' from source: magic vars 11792 1727096141.05639: variable 'omit' from source: magic vars 11792 1727096141.05747: variable 'profile' from source: include params 11792 1727096141.05803: variable 'bond_port_profile' from source: include params 11792 1727096141.05830: variable 'bond_port_profile' from source: include params 11792 1727096141.05856: variable 'omit' from source: magic vars 11792 1727096141.05908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096141.05954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096141.05982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096141.06020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096141.06032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096141.06130: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096141.06134: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.06137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.06199: Set connection var ansible_timeout to 10 11792 1727096141.06238: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096141.06242: Set connection var ansible_shell_executable to /bin/sh 11792 1727096141.06244: Set connection var ansible_pipelining to False 11792 1727096141.06254: Set connection var ansible_shell_type to sh 11792 1727096141.06273: Set connection var ansible_connection to ssh 11792 1727096141.06297: variable 'ansible_shell_executable' from source: unknown 11792 1727096141.06347: variable 'ansible_connection' from source: unknown 11792 1727096141.06353: variable 'ansible_module_compression' from source: unknown 11792 1727096141.06360: variable 'ansible_shell_type' from source: unknown 11792 1727096141.06363: variable 'ansible_shell_executable' from source: unknown 11792 1727096141.06365: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.06368: variable 'ansible_pipelining' from source: unknown 11792 1727096141.06371: variable 'ansible_timeout' from source: unknown 11792 1727096141.06373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.06510: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096141.06525: variable 'omit' from source: magic vars 11792 1727096141.06570: starting attempt loop 11792 1727096141.06576: running the handler 11792 1727096141.06579: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096141.06591: _low_level_execute_command(): starting 11792 1727096141.06604: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096141.07362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096141.07405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096141.07446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096141.07493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096141.07554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096141.09258: stdout chunk (state=3): >>>/root <<< 11792 1727096141.09419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096141.09423: stdout chunk (state=3): >>><<< 11792 1727096141.09425: stderr chunk (state=3): >>><<< 11792 1727096141.09562: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096141.09566: _low_level_execute_command(): starting 11792 1727096141.09571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240 `" && echo ansible-tmp-1727096141.0946255-12865-88458634361240="` echo /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240 `" ) && sleep 0' 11792 1727096141.10206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096141.10223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096141.10372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096141.10376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096141.10399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096141.10480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096141.12472: stdout chunk (state=3): >>>ansible-tmp-1727096141.0946255-12865-88458634361240=/root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240 <<< 11792 1727096141.12635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096141.12639: stdout chunk (state=3): >>><<< 11792 1727096141.12641: stderr chunk (state=3): >>><<< 11792 1727096141.12664: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096141.0946255-12865-88458634361240=/root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096141.12877: variable 'ansible_module_compression' from source: unknown 11792 1727096141.12881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096141.12883: variable 'ansible_facts' from source: unknown 11792 1727096141.12901: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/AnsiballZ_command.py 11792 1727096141.13130: Sending initial data 11792 1727096141.13138: Sent initial data (155 bytes) 11792 1727096141.13661: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096141.13678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096141.13691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096141.13785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096141.13800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096141.13828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096141.13904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096141.15589: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096141.15660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096141.15750: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpxdre7u0c /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/AnsiballZ_command.py <<< 11792 1727096141.15764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/AnsiballZ_command.py" <<< 11792 1727096141.15805: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpxdre7u0c" to remote "/root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/AnsiballZ_command.py" <<< 11792 1727096141.16606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096141.16656: stderr chunk (state=3): >>><<< 11792 1727096141.16670: stdout chunk (state=3): >>><<< 11792 1727096141.16699: done transferring module to remote 11792 1727096141.16715: _low_level_execute_command(): starting 11792 1727096141.16747: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/ /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/AnsiballZ_command.py && sleep 0' 11792 1727096141.17437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096141.17580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096141.17588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096141.17622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096141.17635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096141.17703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096141.19640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096141.19644: stdout chunk (state=3): >>><<< 11792 1727096141.19649: stderr chunk (state=3): >>><<< 11792 1727096141.19670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096141.19674: _low_level_execute_command(): starting 11792 1727096141.19679: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/AnsiballZ_command.py && sleep 0' 11792 1727096141.20327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096141.20338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096141.20355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096141.20405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096141.20408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096141.20411: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096141.20462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096141.20490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096141.20513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096141.20547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096141.20599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096141.38580: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-23 08:55:41.362057", "end": "2024-09-23 08:55:41.383256", "delta": "0:00:00.021199", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096141.40532: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096141.40536: stdout chunk (state=3): >>><<< 11792 1727096141.40538: stderr chunk (state=3): >>><<< 11792 1727096141.40541: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-23 08:55:41.362057", "end": "2024-09-23 08:55:41.383256", "delta": "0:00:00.021199", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096141.40544: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096141.40546: _low_level_execute_command(): starting 11792 1727096141.40549: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096141.0946255-12865-88458634361240/ > /dev/null 2>&1 && sleep 0' 11792 1727096141.41851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096141.41856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096141.41859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096141.41862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096141.41989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096141.42029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096141.43954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096141.44073: stderr chunk (state=3): >>><<< 11792 1727096141.44077: stdout chunk (state=3): >>><<< 11792 1727096141.44283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096141.44287: handler run complete 11792 1727096141.44289: Evaluated conditional (False): False 11792 1727096141.44291: attempt loop complete, returning result 11792 1727096141.44293: _execute() done 11792 1727096141.44295: dumping result to json 11792 1727096141.44297: done dumping result, returning 11792 1727096141.44299: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0afff68d-5257-d9c7-3fc0-00000000055b] 11792 1727096141.44301: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055b 11792 1727096141.44370: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055b ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.021199", "end": "2024-09-23 08:55:41.383256", "rc": 0, "start": "2024-09-23 08:55:41.362057" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11792 1727096141.44443: no more pending results, returning what we have 11792 1727096141.44447: results queue empty 11792 1727096141.44448: checking for any_errors_fatal 11792 1727096141.44457: done checking for any_errors_fatal 11792 1727096141.44458: checking for max_fail_percentage 11792 1727096141.44460: done checking for max_fail_percentage 11792 1727096141.44461: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.44462: done checking to see if all hosts have failed 11792 1727096141.44462: getting the remaining hosts for this loop 11792 1727096141.44466: done getting the remaining hosts for this loop 11792 1727096141.44472: getting the next task for host managed_node2 11792 1727096141.44482: done getting next task for host managed_node2 11792 1727096141.44484: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11792 1727096141.44491: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096141.44495: getting variables 11792 1727096141.44496: in VariableManager get_vars() 11792 1727096141.44527: Calling all_inventory to load vars for managed_node2 11792 1727096141.44530: Calling groups_inventory to load vars for managed_node2 11792 1727096141.44533: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.44545: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.44547: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.44550: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.45374: WORKER PROCESS EXITING 11792 1727096141.47153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.50006: done with get_vars() 11792 1727096141.50043: done getting variables 11792 1727096141.50112: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:55:41 -0400 (0:00:00.461) 0:00:23.780 ****** 11792 1727096141.50146: entering _queue_task() for managed_node2/set_fact 11792 1727096141.50937: worker is 1 (out of 1 available) 11792 1727096141.50949: exiting _queue_task() for managed_node2/set_fact 11792 1727096141.50966: done queuing things up, now waiting for results queue to drain 11792 1727096141.51271: waiting for pending results... 
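The "Get NM profile info" result above records the exact pipeline that ran and, via the later nm_profile_exists.rc == 0 conditional, the variable it was registered into. A sketch of the corresponding shell task follows; changed_when is inferred from the raw command result reporting changed=true while the task result shows changed=false, and the trailing debug task is a hypothetical consumer added only to show how the registered fields can be used:

    - name: Get NM profile info
      ansible.builtin.shell: "nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc"  # profile resolved to bond0.0 in this run
      register: nm_profile_exists  # name taken from the later rc == 0 conditional
      changed_when: false  # inferred, not shown verbatim in the log

    - name: Show matching NM connection files (hypothetical follow-up)
      ansible.builtin.debug:
        var: nm_profile_exists.stdout_lines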
11792 1727096141.51548: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11792 1727096141.51720: in run() - task 0afff68d-5257-d9c7-3fc0-00000000055c 11792 1727096141.51748: variable 'ansible_search_path' from source: unknown 11792 1727096141.51759: variable 'ansible_search_path' from source: unknown 11792 1727096141.51805: calling self._execute() 11792 1727096141.51915: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.51929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.51944: variable 'omit' from source: magic vars 11792 1727096141.52328: variable 'ansible_distribution_major_version' from source: facts 11792 1727096141.52347: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096141.52488: variable 'nm_profile_exists' from source: set_fact 11792 1727096141.52507: Evaluated conditional (nm_profile_exists.rc == 0): True 11792 1727096141.52520: variable 'omit' from source: magic vars 11792 1727096141.52588: variable 'omit' from source: magic vars 11792 1727096141.52626: variable 'omit' from source: magic vars 11792 1727096141.52679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096141.52722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096141.52748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096141.52778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096141.52796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096141.52830: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096141.52839: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.52845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.52941: Set connection var ansible_timeout to 10 11792 1727096141.52956: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096141.52971: Set connection var ansible_shell_executable to /bin/sh 11792 1727096141.52980: Set connection var ansible_pipelining to False 11792 1727096141.52985: Set connection var ansible_shell_type to sh 11792 1727096141.52990: Set connection var ansible_connection to ssh 11792 1727096141.53019: variable 'ansible_shell_executable' from source: unknown 11792 1727096141.53030: variable 'ansible_connection' from source: unknown 11792 1727096141.53038: variable 'ansible_module_compression' from source: unknown 11792 1727096141.53045: variable 'ansible_shell_type' from source: unknown 11792 1727096141.53055: variable 'ansible_shell_executable' from source: unknown 11792 1727096141.53124: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.53128: variable 'ansible_pipelining' from source: unknown 11792 1727096141.53130: variable 'ansible_timeout' from source: unknown 11792 1727096141.53133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.53881: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096141.53886: variable 'omit' from source: magic vars 11792 1727096141.53888: starting attempt loop 11792 1727096141.53892: running the handler 11792 1727096141.53894: handler run complete 11792 1727096141.53896: attempt loop complete, returning result 11792 1727096141.53898: _execute() done 11792 1727096141.53899: dumping result to json 11792 1727096141.53901: done dumping result, returning 11792 1727096141.53904: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-d9c7-3fc0-00000000055c] 11792 1727096141.53905: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055c 11792 1727096141.54374: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055c ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11792 1727096141.54434: no more pending results, returning what we have 11792 1727096141.54440: results queue empty 11792 1727096141.54441: checking for any_errors_fatal 11792 1727096141.54455: done checking for any_errors_fatal 11792 1727096141.54456: checking for max_fail_percentage 11792 1727096141.54458: done checking for max_fail_percentage 11792 1727096141.54462: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.54462: done checking to see if all hosts have failed 11792 1727096141.54463: getting the remaining hosts for this loop 11792 1727096141.54465: done getting the remaining hosts for this loop 11792 1727096141.54470: getting the next task for host managed_node2 11792 1727096141.54483: done getting next task for host managed_node2 11792 1727096141.54485: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11792 1727096141.54492: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096141.54495: getting variables 11792 1727096141.54496: in VariableManager get_vars() 11792 1727096141.54530: Calling all_inventory to load vars for managed_node2 11792 1727096141.54533: Calling groups_inventory to load vars for managed_node2 11792 1727096141.54536: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.54548: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.54551: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.54554: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.55132: WORKER PROCESS EXITING 11792 1727096141.57272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.59223: done with get_vars() 11792 1727096141.59259: done getting variables 11792 1727096141.59336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096141.59471: variable 'profile' from source: include params 11792 1727096141.59475: variable 'bond_port_profile' from source: include params 11792 1727096141.59544: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:55:41 -0400 (0:00:00.094) 0:00:23.874 ****** 11792 1727096141.59581: entering _queue_task() for managed_node2/command 11792 1727096141.60087: worker is 1 (out of 1 available) 11792 1727096141.60099: exiting _queue_task() for managed_node2/command 11792 1727096141.60111: done queuing things up, now waiting for results queue to drain 11792 1727096141.60113: waiting for pending results... 
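The ansible_facts shown above (lsr_net_profile_exists, lsr_net_profile_ansible_managed, and lsr_net_profile_fingerprint, all true) are set only because the nmcli lookup succeeded. Reconstructed from those fact names and the nm_profile_exists.rc == 0 condition evaluated in this run, the task plausibly reads:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0  # rc was 0 above, so all three facts are set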
11792 1727096141.60300: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11792 1727096141.60533: in run() - task 0afff68d-5257-d9c7-3fc0-00000000055e 11792 1727096141.60547: variable 'ansible_search_path' from source: unknown 11792 1727096141.60551: variable 'ansible_search_path' from source: unknown 11792 1727096141.60658: calling self._execute() 11792 1727096141.60759: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.60765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.60795: variable 'omit' from source: magic vars 11792 1727096141.61187: variable 'ansible_distribution_major_version' from source: facts 11792 1727096141.61190: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096141.61306: variable 'profile_stat' from source: set_fact 11792 1727096141.61321: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096141.61324: when evaluation is False, skipping this task 11792 1727096141.61327: _execute() done 11792 1727096141.61330: dumping result to json 11792 1727096141.61333: done dumping result, returning 11792 1727096141.61338: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0afff68d-5257-d9c7-3fc0-00000000055e] 11792 1727096141.61343: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055e 11792 1727096141.61433: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055e 11792 1727096141.61436: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096141.61495: no more pending results, returning what we have 11792 1727096141.61500: results queue empty 11792 1727096141.61501: checking for any_errors_fatal 11792 1727096141.61510: done checking for any_errors_fatal 11792 1727096141.61511: checking for max_fail_percentage 11792 1727096141.61513: done checking for max_fail_percentage 11792 1727096141.61513: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.61514: done checking to see if all hosts have failed 11792 1727096141.61515: getting the remaining hosts for this loop 11792 1727096141.61516: done getting the remaining hosts for this loop 11792 1727096141.61520: getting the next task for host managed_node2 11792 1727096141.61533: done getting next task for host managed_node2 11792 1727096141.61536: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11792 1727096141.61543: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096141.61548: getting variables 11792 1727096141.61549: in VariableManager get_vars() 11792 1727096141.61584: Calling all_inventory to load vars for managed_node2 11792 1727096141.61587: Calling groups_inventory to load vars for managed_node2 11792 1727096141.61591: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.61607: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.61610: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.61614: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.63282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.65110: done with get_vars() 11792 1727096141.65135: done getting variables 11792 1727096141.65215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096141.65345: variable 'profile' from source: include params 11792 1727096141.65349: variable 'bond_port_profile' from source: include params 11792 1727096141.65420: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:55:41 -0400 (0:00:00.058) 0:00:23.933 ****** 11792 1727096141.65456: entering _queue_task() for managed_node2/set_fact 11792 1727096141.65850: worker is 1 (out of 1 available) 11792 1727096141.65864: exiting _queue_task() for managed_node2/set_fact 11792 1727096141.65880: done queuing things up, now waiting for results queue to drain 11792 1727096141.65881: waiting for pending results... 
11792 1727096141.66387: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11792 1727096141.66393: in run() - task 0afff68d-5257-d9c7-3fc0-00000000055f 11792 1727096141.66396: variable 'ansible_search_path' from source: unknown 11792 1727096141.66399: variable 'ansible_search_path' from source: unknown 11792 1727096141.66411: calling self._execute() 11792 1727096141.66520: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.66528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.66538: variable 'omit' from source: magic vars 11792 1727096141.66924: variable 'ansible_distribution_major_version' from source: facts 11792 1727096141.66935: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096141.67058: variable 'profile_stat' from source: set_fact 11792 1727096141.67081: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096141.67085: when evaluation is False, skipping this task 11792 1727096141.67088: _execute() done 11792 1727096141.67090: dumping result to json 11792 1727096141.67092: done dumping result, returning 11792 1727096141.67095: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0afff68d-5257-d9c7-3fc0-00000000055f] 11792 1727096141.67097: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055f 11792 1727096141.67190: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000055f 11792 1727096141.67193: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096141.67261: no more pending results, returning what we have 11792 1727096141.67266: results queue empty 11792 1727096141.67266: checking for any_errors_fatal 11792 1727096141.67274: done checking for any_errors_fatal 11792 1727096141.67275: checking for max_fail_percentage 11792 1727096141.67276: done checking for max_fail_percentage 11792 1727096141.67277: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.67278: done checking to see if all hosts have failed 11792 1727096141.67278: getting the remaining hosts for this loop 11792 1727096141.67280: done getting the remaining hosts for this loop 11792 1727096141.67285: getting the next task for host managed_node2 11792 1727096141.67292: done getting next task for host managed_node2 11792 1727096141.67295: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11792 1727096141.67301: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096141.67304: getting variables 11792 1727096141.67306: in VariableManager get_vars() 11792 1727096141.67337: Calling all_inventory to load vars for managed_node2 11792 1727096141.67340: Calling groups_inventory to load vars for managed_node2 11792 1727096141.67343: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.67356: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.67359: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.67361: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.69214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.70922: done with get_vars() 11792 1727096141.70963: done getting variables 11792 1727096141.71028: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096141.71155: variable 'profile' from source: include params 11792 1727096141.71159: variable 'bond_port_profile' from source: include params 11792 1727096141.71229: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:55:41 -0400 (0:00:00.058) 0:00:23.991 ****** 11792 1727096141.71264: entering _queue_task() for managed_node2/command 11792 1727096141.71656: worker is 1 (out of 1 available) 11792 1727096141.71672: exiting _queue_task() for managed_node2/command 11792 1727096141.71687: done queuing things up, now waiting for results queue to drain 11792 1727096141.71688: waiting for pending results... 
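The command task queued above (tasks/get_profile_stat.yml:62) runs only when profile_stat.stat.exists; in this run it is skipped because the profile file was not found. Since the trace identifies only the action plugin and the guard, this is a hedged sketch of the kind of command such a task could run; the grep pattern, the ifcfg directory, and the __fingerprint register name are all assumptions.

  - name: Get the fingerprint comment in ifcfg-{{ profile }}
    command: grep "^# system_role:network" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
    register: __fingerprint           # illustrative register name
    changed_when: false               # a read-only check should not report a change
    failed_when: false                # absence of the comment is handled by the follow-up task
    when: profile_stat.stat.exists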
11792 1727096141.72086: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 11792 1727096141.72096: in run() - task 0afff68d-5257-d9c7-3fc0-000000000560 11792 1727096141.72116: variable 'ansible_search_path' from source: unknown 11792 1727096141.72123: variable 'ansible_search_path' from source: unknown 11792 1727096141.72166: calling self._execute() 11792 1727096141.72261: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.72277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.72290: variable 'omit' from source: magic vars 11792 1727096141.72650: variable 'ansible_distribution_major_version' from source: facts 11792 1727096141.72674: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096141.72796: variable 'profile_stat' from source: set_fact 11792 1727096141.72813: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096141.72822: when evaluation is False, skipping this task 11792 1727096141.72830: _execute() done 11792 1727096141.72837: dumping result to json 11792 1727096141.72845: done dumping result, returning 11792 1727096141.72859: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0afff68d-5257-d9c7-3fc0-000000000560] 11792 1727096141.72871: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000560 11792 1727096141.73135: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000560 11792 1727096141.73139: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096141.73190: no more pending results, returning what we have 11792 1727096141.73194: results queue empty 11792 1727096141.73195: checking for any_errors_fatal 11792 1727096141.73200: done checking for any_errors_fatal 11792 1727096141.73201: checking for max_fail_percentage 11792 1727096141.73203: done checking for max_fail_percentage 11792 1727096141.73204: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.73204: done checking to see if all hosts have failed 11792 1727096141.73205: getting the remaining hosts for this loop 11792 1727096141.73206: done getting the remaining hosts for this loop 11792 1727096141.73209: getting the next task for host managed_node2 11792 1727096141.73216: done getting next task for host managed_node2 11792 1727096141.73218: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11792 1727096141.73223: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096141.73227: getting variables 11792 1727096141.73228: in VariableManager get_vars() 11792 1727096141.73256: Calling all_inventory to load vars for managed_node2 11792 1727096141.73258: Calling groups_inventory to load vars for managed_node2 11792 1727096141.73261: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.73272: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.73274: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.73277: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.75300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.78087: done with get_vars() 11792 1727096141.78197: done getting variables 11792 1727096141.78328: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096141.78569: variable 'profile' from source: include params 11792 1727096141.78574: variable 'bond_port_profile' from source: include params 11792 1727096141.78640: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:55:41 -0400 (0:00:00.075) 0:00:24.067 ****** 11792 1727096141.78793: entering _queue_task() for managed_node2/set_fact 11792 1727096141.79463: worker is 1 (out of 1 available) 11792 1727096141.79481: exiting _queue_task() for managed_node2/set_fact 11792 1727096141.79610: done queuing things up, now waiting for results queue to drain 11792 1727096141.79613: waiting for pending results... 
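As with the two tasks before it, this set_fact at tasks/get_profile_stat.yml:69 is skipped here because profile_stat.stat.exists is false. A hedged sketch of how it could set the lsr_net_profile_fingerprint flag asserted later in the run; the __fingerprint register it reads is the illustrative name introduced in the previous sketch, not something shown in the trace.

  - name: Verify the fingerprint comment in ifcfg-{{ profile }}
    set_fact:
      lsr_net_profile_fingerprint: true
    when:
      - profile_stat.stat.exists
      - __fingerprint.rc | default(1) == 0   # assumption: the fingerprint grep above succeeded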
11792 1727096141.79817: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11792 1727096141.79950: in run() - task 0afff68d-5257-d9c7-3fc0-000000000561 11792 1727096141.79974: variable 'ansible_search_path' from source: unknown 11792 1727096141.79978: variable 'ansible_search_path' from source: unknown 11792 1727096141.80076: calling self._execute() 11792 1727096141.80272: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.80276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.80279: variable 'omit' from source: magic vars 11792 1727096141.80498: variable 'ansible_distribution_major_version' from source: facts 11792 1727096141.80507: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096141.80635: variable 'profile_stat' from source: set_fact 11792 1727096141.80647: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096141.80650: when evaluation is False, skipping this task 11792 1727096141.80656: _execute() done 11792 1727096141.80659: dumping result to json 11792 1727096141.80662: done dumping result, returning 11792 1727096141.80664: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0afff68d-5257-d9c7-3fc0-000000000561] 11792 1727096141.80669: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000561 11792 1727096141.80786: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000561 11792 1727096141.80789: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096141.80854: no more pending results, returning what we have 11792 1727096141.80859: results queue empty 11792 1727096141.80859: checking for any_errors_fatal 11792 1727096141.80869: done checking for any_errors_fatal 11792 1727096141.80870: checking for max_fail_percentage 11792 1727096141.80873: done checking for max_fail_percentage 11792 1727096141.80873: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.80874: done checking to see if all hosts have failed 11792 1727096141.80875: getting the remaining hosts for this loop 11792 1727096141.80876: done getting the remaining hosts for this loop 11792 1727096141.80880: getting the next task for host managed_node2 11792 1727096141.80890: done getting next task for host managed_node2 11792 1727096141.80893: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11792 1727096141.80899: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096141.80904: getting variables 11792 1727096141.80906: in VariableManager get_vars() 11792 1727096141.80940: Calling all_inventory to load vars for managed_node2 11792 1727096141.80943: Calling groups_inventory to load vars for managed_node2 11792 1727096141.80947: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.81090: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.81099: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.81107: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.83548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.85273: done with get_vars() 11792 1727096141.85307: done getting variables 11792 1727096141.85382: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096141.85631: variable 'profile' from source: include params 11792 1727096141.85635: variable 'bond_port_profile' from source: include params 11792 1727096141.85752: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:55:41 -0400 (0:00:00.072) 0:00:24.139 ****** 11792 1727096141.86006: entering _queue_task() for managed_node2/assert 11792 1727096141.86662: worker is 1 (out of 1 available) 11792 1727096141.86680: exiting _queue_task() for managed_node2/assert 11792 1727096141.86699: done queuing things up, now waiting for results queue to drain 11792 1727096141.86701: waiting for pending results... 
11792 1727096141.87492: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' 11792 1727096141.87711: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004e1 11792 1727096141.87715: variable 'ansible_search_path' from source: unknown 11792 1727096141.87718: variable 'ansible_search_path' from source: unknown 11792 1727096141.87722: calling self._execute() 11792 1727096141.87909: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.88147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.88154: variable 'omit' from source: magic vars 11792 1727096141.88770: variable 'ansible_distribution_major_version' from source: facts 11792 1727096141.88880: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096141.88920: variable 'omit' from source: magic vars 11792 1727096141.89063: variable 'omit' from source: magic vars 11792 1727096141.89295: variable 'profile' from source: include params 11792 1727096141.89329: variable 'bond_port_profile' from source: include params 11792 1727096141.89440: variable 'bond_port_profile' from source: include params 11792 1727096141.89519: variable 'omit' from source: magic vars 11792 1727096141.89606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096141.89874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096141.89877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096141.89880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096141.89882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096141.90014: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096141.90017: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.90020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.90362: Set connection var ansible_timeout to 10 11792 1727096141.90365: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096141.90367: Set connection var ansible_shell_executable to /bin/sh 11792 1727096141.90371: Set connection var ansible_pipelining to False 11792 1727096141.90372: Set connection var ansible_shell_type to sh 11792 1727096141.90374: Set connection var ansible_connection to ssh 11792 1727096141.90973: variable 'ansible_shell_executable' from source: unknown 11792 1727096141.90977: variable 'ansible_connection' from source: unknown 11792 1727096141.90979: variable 'ansible_module_compression' from source: unknown 11792 1727096141.90982: variable 'ansible_shell_type' from source: unknown 11792 1727096141.90984: variable 'ansible_shell_executable' from source: unknown 11792 1727096141.90986: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.90987: variable 'ansible_pipelining' from source: unknown 11792 1727096141.90990: variable 'ansible_timeout' from source: unknown 11792 1727096141.90992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.90995: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096141.90997: variable 'omit' from source: magic vars 11792 1727096141.91256: starting attempt loop 11792 1727096141.91267: running the handler 11792 1727096141.91406: variable 'lsr_net_profile_exists' from source: set_fact 11792 1727096141.91583: Evaluated conditional (lsr_net_profile_exists): True 11792 1727096141.91596: handler run complete 11792 1727096141.91618: attempt loop complete, returning result 11792 1727096141.91627: _execute() done 11792 1727096141.91634: dumping result to json 11792 1727096141.91973: done dumping result, returning 11792 1727096141.91976: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' [0afff68d-5257-d9c7-3fc0-0000000004e1] 11792 1727096141.91979: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e1 11792 1727096141.92054: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e1 11792 1727096141.92058: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096141.92118: no more pending results, returning what we have 11792 1727096141.92122: results queue empty 11792 1727096141.92123: checking for any_errors_fatal 11792 1727096141.92132: done checking for any_errors_fatal 11792 1727096141.92133: checking for max_fail_percentage 11792 1727096141.92135: done checking for max_fail_percentage 11792 1727096141.92136: checking to see if all hosts have failed and the running result is not ok 11792 1727096141.92137: done checking to see if all hosts have failed 11792 1727096141.92138: getting the remaining hosts for this loop 11792 1727096141.92139: done getting the remaining hosts for this loop 11792 1727096141.92143: getting the next task for host managed_node2 11792 1727096141.92152: done getting next task for host managed_node2 11792 1727096141.92155: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11792 1727096141.92161: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096141.92167: getting variables 11792 1727096141.92177: in VariableManager get_vars() 11792 1727096141.92215: Calling all_inventory to load vars for managed_node2 11792 1727096141.92218: Calling groups_inventory to load vars for managed_node2 11792 1727096141.92222: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096141.92237: Calling all_plugins_play to load vars for managed_node2 11792 1727096141.92241: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096141.92244: Calling groups_plugins_play to load vars for managed_node2 11792 1727096141.95799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096141.98598: done with get_vars() 11792 1727096141.98630: done getting variables 11792 1727096141.98696: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096141.98823: variable 'profile' from source: include params 11792 1727096141.98828: variable 'bond_port_profile' from source: include params 11792 1727096141.98886: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:55:41 -0400 (0:00:00.129) 0:00:24.268 ****** 11792 1727096141.98924: entering _queue_task() for managed_node2/assert 11792 1727096141.99279: worker is 1 (out of 1 available) 11792 1727096141.99293: exiting _queue_task() for managed_node2/assert 11792 1727096141.99306: done queuing things up, now waiting for results queue to drain 11792 1727096141.99307: waiting for pending results... 
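The next assert (tasks/assert_profile_present.yml:10) checks lsr_net_profile_ansible_managed, which the trace confirms below via "Evaluated conditional (lsr_net_profile_ansible_managed): True". A sketch along the same lines, again with an assumed failure message:

  - name: Assert that the ansible managed comment is present in '{{ profile }}'
    assert:
      that:
        - lsr_net_profile_ansible_managed
      fail_msg: "ansible_managed comment missing from {{ profile }}"   # assumed wording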
11792 1727096141.99731: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11792 1727096141.99737: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004e2 11792 1727096141.99741: variable 'ansible_search_path' from source: unknown 11792 1727096141.99745: variable 'ansible_search_path' from source: unknown 11792 1727096141.99828: calling self._execute() 11792 1727096141.99973: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096141.99977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096141.99980: variable 'omit' from source: magic vars 11792 1727096142.00264: variable 'ansible_distribution_major_version' from source: facts 11792 1727096142.00277: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096142.00283: variable 'omit' from source: magic vars 11792 1727096142.00338: variable 'omit' from source: magic vars 11792 1727096142.00444: variable 'profile' from source: include params 11792 1727096142.00448: variable 'bond_port_profile' from source: include params 11792 1727096142.00519: variable 'bond_port_profile' from source: include params 11792 1727096142.00539: variable 'omit' from source: magic vars 11792 1727096142.00772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096142.00776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096142.00779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096142.00785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.00788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.00791: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096142.00795: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.00797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.00821: Set connection var ansible_timeout to 10 11792 1727096142.00829: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096142.00838: Set connection var ansible_shell_executable to /bin/sh 11792 1727096142.00843: Set connection var ansible_pipelining to False 11792 1727096142.00846: Set connection var ansible_shell_type to sh 11792 1727096142.00849: Set connection var ansible_connection to ssh 11792 1727096142.00973: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.00976: variable 'ansible_connection' from source: unknown 11792 1727096142.00979: variable 'ansible_module_compression' from source: unknown 11792 1727096142.00981: variable 'ansible_shell_type' from source: unknown 11792 1727096142.00983: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.00985: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.00987: variable 'ansible_pipelining' from source: unknown 11792 1727096142.00989: variable 'ansible_timeout' from source: unknown 11792 1727096142.00991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.01066: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096142.01080: variable 'omit' from source: magic vars 11792 1727096142.01085: starting attempt loop 11792 1727096142.01089: running the handler 11792 1727096142.01272: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11792 1727096142.01275: Evaluated conditional (lsr_net_profile_ansible_managed): True 11792 1727096142.01277: handler run complete 11792 1727096142.01280: attempt loop complete, returning result 11792 1727096142.01282: _execute() done 11792 1727096142.01284: dumping result to json 11792 1727096142.01286: done dumping result, returning 11792 1727096142.01289: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0afff68d-5257-d9c7-3fc0-0000000004e2] 11792 1727096142.01290: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e2 11792 1727096142.01570: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e2 11792 1727096142.01574: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096142.01616: no more pending results, returning what we have 11792 1727096142.01619: results queue empty 11792 1727096142.01620: checking for any_errors_fatal 11792 1727096142.01626: done checking for any_errors_fatal 11792 1727096142.01627: checking for max_fail_percentage 11792 1727096142.01629: done checking for max_fail_percentage 11792 1727096142.01630: checking to see if all hosts have failed and the running result is not ok 11792 1727096142.01631: done checking to see if all hosts have failed 11792 1727096142.01632: getting the remaining hosts for this loop 11792 1727096142.01633: done getting the remaining hosts for this loop 11792 1727096142.01636: getting the next task for host managed_node2 11792 1727096142.01644: done getting next task for host managed_node2 11792 1727096142.01646: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11792 1727096142.01664: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096142.01671: getting variables 11792 1727096142.01673: in VariableManager get_vars() 11792 1727096142.01702: Calling all_inventory to load vars for managed_node2 11792 1727096142.01704: Calling groups_inventory to load vars for managed_node2 11792 1727096142.01707: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096142.01717: Calling all_plugins_play to load vars for managed_node2 11792 1727096142.01719: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096142.01722: Calling groups_plugins_play to load vars for managed_node2 11792 1727096142.03248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096142.05506: done with get_vars() 11792 1727096142.05648: done getting variables 11792 1727096142.05715: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096142.05952: variable 'profile' from source: include params 11792 1727096142.05957: variable 'bond_port_profile' from source: include params 11792 1727096142.06128: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:55:42 -0400 (0:00:00.072) 0:00:24.340 ****** 11792 1727096142.06159: entering _queue_task() for managed_node2/assert 11792 1727096142.06850: worker is 1 (out of 1 available) 11792 1727096142.06864: exiting _queue_task() for managed_node2/assert 11792 1727096142.06880: done queuing things up, now waiting for results queue to drain 11792 1727096142.06882: waiting for pending results... 
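The third assert (tasks/assert_profile_present.yml:15) follows the same pattern for the fingerprint flag; the trace shows "Evaluated conditional (lsr_net_profile_fingerprint): True" once the handler runs. Sketch, with the fail_msg assumed:

  - name: Assert that the fingerprint comment is present in {{ profile }}
    assert:
      that:
        - lsr_net_profile_fingerprint
      fail_msg: "fingerprint comment missing from {{ profile }}"   # assumed wording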
11792 1727096142.07594: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 11792 1727096142.08163: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004e3 11792 1727096142.08169: variable 'ansible_search_path' from source: unknown 11792 1727096142.08172: variable 'ansible_search_path' from source: unknown 11792 1727096142.08174: calling self._execute() 11792 1727096142.08325: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.08387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.08401: variable 'omit' from source: magic vars 11792 1727096142.09162: variable 'ansible_distribution_major_version' from source: facts 11792 1727096142.09203: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096142.09338: variable 'omit' from source: magic vars 11792 1727096142.09413: variable 'omit' from source: magic vars 11792 1727096142.09975: variable 'profile' from source: include params 11792 1727096142.09978: variable 'bond_port_profile' from source: include params 11792 1727096142.09980: variable 'bond_port_profile' from source: include params 11792 1727096142.09982: variable 'omit' from source: magic vars 11792 1727096142.09983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096142.10326: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096142.10346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096142.10364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.10377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.10411: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096142.10417: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.10420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.10672: Set connection var ansible_timeout to 10 11792 1727096142.10676: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096142.10678: Set connection var ansible_shell_executable to /bin/sh 11792 1727096142.10681: Set connection var ansible_pipelining to False 11792 1727096142.10683: Set connection var ansible_shell_type to sh 11792 1727096142.10685: Set connection var ansible_connection to ssh 11792 1727096142.10687: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.10689: variable 'ansible_connection' from source: unknown 11792 1727096142.10691: variable 'ansible_module_compression' from source: unknown 11792 1727096142.10693: variable 'ansible_shell_type' from source: unknown 11792 1727096142.10695: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.10697: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.10699: variable 'ansible_pipelining' from source: unknown 11792 1727096142.10701: variable 'ansible_timeout' from source: unknown 11792 1727096142.10703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.10872: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096142.10876: variable 'omit' from source: magic vars 11792 1727096142.10878: starting attempt loop 11792 1727096142.10880: running the handler 11792 1727096142.10897: variable 'lsr_net_profile_fingerprint' from source: set_fact 11792 1727096142.10903: Evaluated conditional (lsr_net_profile_fingerprint): True 11792 1727096142.10910: handler run complete 11792 1727096142.10958: attempt loop complete, returning result 11792 1727096142.10961: _execute() done 11792 1727096142.10963: dumping result to json 11792 1727096142.10966: done dumping result, returning 11792 1727096142.10969: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 [0afff68d-5257-d9c7-3fc0-0000000004e3] 11792 1727096142.10972: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e3 11792 1727096142.11044: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e3 11792 1727096142.11046: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096142.11108: no more pending results, returning what we have 11792 1727096142.11112: results queue empty 11792 1727096142.11113: checking for any_errors_fatal 11792 1727096142.11121: done checking for any_errors_fatal 11792 1727096142.11122: checking for max_fail_percentage 11792 1727096142.11124: done checking for max_fail_percentage 11792 1727096142.11125: checking to see if all hosts have failed and the running result is not ok 11792 1727096142.11125: done checking to see if all hosts have failed 11792 1727096142.11126: getting the remaining hosts for this loop 11792 1727096142.11128: done getting the remaining hosts for this loop 11792 1727096142.11131: getting the next task for host managed_node2 11792 1727096142.11142: done getting next task for host managed_node2 11792 1727096142.11145: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11792 1727096142.11149: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096142.11157: getting variables 11792 1727096142.11158: in VariableManager get_vars() 11792 1727096142.11201: Calling all_inventory to load vars for managed_node2 11792 1727096142.11204: Calling groups_inventory to load vars for managed_node2 11792 1727096142.11208: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096142.11220: Calling all_plugins_play to load vars for managed_node2 11792 1727096142.11223: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096142.11226: Calling groups_plugins_play to load vars for managed_node2 11792 1727096142.12810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096142.15077: done with get_vars() 11792 1727096142.15110: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:55:42 -0400 (0:00:00.091) 0:00:24.432 ****** 11792 1727096142.15339: entering _queue_task() for managed_node2/include_tasks 11792 1727096142.15913: worker is 1 (out of 1 available) 11792 1727096142.16040: exiting _queue_task() for managed_node2/include_tasks 11792 1727096142.16052: done queuing things up, now waiting for results queue to drain 11792 1727096142.16053: waiting for pending results... 11792 1727096142.16270: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 11792 1727096142.16574: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004e7 11792 1727096142.16578: variable 'ansible_search_path' from source: unknown 11792 1727096142.16580: variable 'ansible_search_path' from source: unknown 11792 1727096142.16583: calling self._execute() 11792 1727096142.16585: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.16588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.16590: variable 'omit' from source: magic vars 11792 1727096142.16959: variable 'ansible_distribution_major_version' from source: facts 11792 1727096142.16972: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096142.16978: _execute() done 11792 1727096142.16982: dumping result to json 11792 1727096142.16985: done dumping result, returning 11792 1727096142.16992: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-d9c7-3fc0-0000000004e7] 11792 1727096142.16995: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e7 11792 1727096142.17102: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e7 11792 1727096142.17105: WORKER PROCESS EXITING 11792 1727096142.17142: no more pending results, returning what we have 11792 1727096142.17148: in VariableManager get_vars() 11792 1727096142.17189: Calling all_inventory to load vars for managed_node2 11792 1727096142.17192: Calling groups_inventory to load vars for managed_node2 11792 1727096142.17196: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096142.17211: Calling all_plugins_play to load vars for managed_node2 11792 1727096142.17215: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096142.17218: Calling groups_plugins_play to load vars for managed_node2 11792 1727096142.20534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11792 1727096142.23922: done with get_vars() 11792 1727096142.23947: variable 'ansible_search_path' from source: unknown 11792 1727096142.23948: variable 'ansible_search_path' from source: unknown 11792 1727096142.23988: we have included files to process 11792 1727096142.23990: generating all_blocks data 11792 1727096142.23992: done generating all_blocks data 11792 1727096142.23997: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096142.23998: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096142.24002: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11792 1727096142.26087: done processing included file 11792 1727096142.26089: iterating over new_blocks loaded from include file 11792 1727096142.26091: in VariableManager get_vars() 11792 1727096142.26108: done with get_vars() 11792 1727096142.26110: filtering new block on tags 11792 1727096142.26305: done filtering new block on tags 11792 1727096142.26309: in VariableManager get_vars() 11792 1727096142.26324: done with get_vars() 11792 1727096142.26325: filtering new block on tags 11792 1727096142.26511: done filtering new block on tags 11792 1727096142.26514: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 11792 1727096142.26520: extending task lists for all hosts with included blocks 11792 1727096142.27430: done extending task lists 11792 1727096142.27432: done processing included files 11792 1727096142.27433: results queue empty 11792 1727096142.27433: checking for any_errors_fatal 11792 1727096142.27437: done checking for any_errors_fatal 11792 1727096142.27437: checking for max_fail_percentage 11792 1727096142.27438: done checking for max_fail_percentage 11792 1727096142.27439: checking to see if all hosts have failed and the running result is not ok 11792 1727096142.27439: done checking to see if all hosts have failed 11792 1727096142.27440: getting the remaining hosts for this loop 11792 1727096142.27441: done getting the remaining hosts for this loop 11792 1727096142.27443: getting the next task for host managed_node2 11792 1727096142.27448: done getting next task for host managed_node2 11792 1727096142.27450: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11792 1727096142.27453: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096142.27455: getting variables 11792 1727096142.27456: in VariableManager get_vars() 11792 1727096142.27465: Calling all_inventory to load vars for managed_node2 11792 1727096142.27469: Calling groups_inventory to load vars for managed_node2 11792 1727096142.27471: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096142.27477: Calling all_plugins_play to load vars for managed_node2 11792 1727096142.27479: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096142.27482: Calling groups_plugins_play to load vars for managed_node2 11792 1727096142.28890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096142.30396: done with get_vars() 11792 1727096142.30420: done getting variables 11792 1727096142.30470: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:55:42 -0400 (0:00:00.151) 0:00:24.584 ****** 11792 1727096142.30501: entering _queue_task() for managed_node2/set_fact 11792 1727096142.30842: worker is 1 (out of 1 available) 11792 1727096142.30857: exiting _queue_task() for managed_node2/set_fact 11792 1727096142.30873: done queuing things up, now waiting for results queue to drain 11792 1727096142.30874: waiting for pending results... 
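The include at assert_profile_present.yml:3 has pulled get_profile_stat.yml back in, and its first task (get_profile_stat.yml:3) is the set_fact queued above. This one can be reconstructed almost directly from the result printed below, which shows the three flags being reset to false; only the YAML layout is assumed.

  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false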
11792 1727096142.31291: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 11792 1727096142.31296: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005b4 11792 1727096142.31317: variable 'ansible_search_path' from source: unknown 11792 1727096142.31323: variable 'ansible_search_path' from source: unknown 11792 1727096142.31366: calling self._execute() 11792 1727096142.31473: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.31486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.31505: variable 'omit' from source: magic vars 11792 1727096142.32148: variable 'ansible_distribution_major_version' from source: facts 11792 1727096142.32153: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096142.32156: variable 'omit' from source: magic vars 11792 1727096142.32277: variable 'omit' from source: magic vars 11792 1727096142.32280: variable 'omit' from source: magic vars 11792 1727096142.32390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096142.32429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096142.32458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096142.32674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.32677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.32680: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096142.32682: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.32684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.32849: Set connection var ansible_timeout to 10 11792 1727096142.32869: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096142.32904: Set connection var ansible_shell_executable to /bin/sh 11792 1727096142.32981: Set connection var ansible_pipelining to False 11792 1727096142.32989: Set connection var ansible_shell_type to sh 11792 1727096142.32999: Set connection var ansible_connection to ssh 11792 1727096142.33023: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.33033: variable 'ansible_connection' from source: unknown 11792 1727096142.33041: variable 'ansible_module_compression' from source: unknown 11792 1727096142.33047: variable 'ansible_shell_type' from source: unknown 11792 1727096142.33056: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.33062: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.33110: variable 'ansible_pipelining' from source: unknown 11792 1727096142.33120: variable 'ansible_timeout' from source: unknown 11792 1727096142.33128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.33389: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096142.33446: variable 
'omit' from source: magic vars 11792 1727096142.33461: starting attempt loop 11792 1727096142.33479: running the handler 11792 1727096142.33492: handler run complete 11792 1727096142.33576: attempt loop complete, returning result 11792 1727096142.33579: _execute() done 11792 1727096142.33582: dumping result to json 11792 1727096142.33584: done dumping result, returning 11792 1727096142.33593: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-d9c7-3fc0-0000000005b4] 11792 1727096142.33595: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b4 11792 1727096142.33808: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b4 11792 1727096142.33811: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11792 1727096142.33865: no more pending results, returning what we have 11792 1727096142.33871: results queue empty 11792 1727096142.33872: checking for any_errors_fatal 11792 1727096142.33874: done checking for any_errors_fatal 11792 1727096142.33874: checking for max_fail_percentage 11792 1727096142.33876: done checking for max_fail_percentage 11792 1727096142.33877: checking to see if all hosts have failed and the running result is not ok 11792 1727096142.33878: done checking to see if all hosts have failed 11792 1727096142.33879: getting the remaining hosts for this loop 11792 1727096142.33881: done getting the remaining hosts for this loop 11792 1727096142.33885: getting the next task for host managed_node2 11792 1727096142.33893: done getting next task for host managed_node2 11792 1727096142.33896: ^ task is: TASK: Stat profile file 11792 1727096142.33901: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096142.33905: getting variables 11792 1727096142.33907: in VariableManager get_vars() 11792 1727096142.33939: Calling all_inventory to load vars for managed_node2 11792 1727096142.33941: Calling groups_inventory to load vars for managed_node2 11792 1727096142.33944: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096142.33958: Calling all_plugins_play to load vars for managed_node2 11792 1727096142.33961: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096142.33963: Calling groups_plugins_play to load vars for managed_node2 11792 1727096142.36976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096142.40007: done with get_vars() 11792 1727096142.40037: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:55:42 -0400 (0:00:00.098) 0:00:24.682 ****** 11792 1727096142.40357: entering _queue_task() for managed_node2/stat 11792 1727096142.41306: worker is 1 (out of 1 available) 11792 1727096142.41317: exiting _queue_task() for managed_node2/stat 11792 1727096142.41328: done queuing things up, now waiting for results queue to drain 11792 1727096142.41330: waiting for pending results... 11792 1727096142.41498: running TaskExecutor() for managed_node2/TASK: Stat profile file 11792 1727096142.41775: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005b5 11792 1727096142.41779: variable 'ansible_search_path' from source: unknown 11792 1727096142.41782: variable 'ansible_search_path' from source: unknown 11792 1727096142.41785: calling self._execute() 11792 1727096142.41857: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.41875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.41892: variable 'omit' from source: magic vars 11792 1727096142.42308: variable 'ansible_distribution_major_version' from source: facts 11792 1727096142.42335: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096142.42573: variable 'omit' from source: magic vars 11792 1727096142.42577: variable 'omit' from source: magic vars 11792 1727096142.42579: variable 'profile' from source: include params 11792 1727096142.42581: variable 'bond_port_profile' from source: include params 11792 1727096142.42596: variable 'bond_port_profile' from source: include params 11792 1727096142.42622: variable 'omit' from source: magic vars 11792 1727096142.42675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096142.42724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096142.42750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096142.42777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.42793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096142.42831: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096142.42840: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 
1727096142.42847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.42964: Set connection var ansible_timeout to 10 11792 1727096142.42984: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096142.43032: Set connection var ansible_shell_executable to /bin/sh 11792 1727096142.43044: Set connection var ansible_pipelining to False 11792 1727096142.43050: Set connection var ansible_shell_type to sh 11792 1727096142.43061: Set connection var ansible_connection to ssh 11792 1727096142.43090: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.43098: variable 'ansible_connection' from source: unknown 11792 1727096142.43105: variable 'ansible_module_compression' from source: unknown 11792 1727096142.43111: variable 'ansible_shell_type' from source: unknown 11792 1727096142.43116: variable 'ansible_shell_executable' from source: unknown 11792 1727096142.43122: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.43135: variable 'ansible_pipelining' from source: unknown 11792 1727096142.43141: variable 'ansible_timeout' from source: unknown 11792 1727096142.43148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.43674: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096142.43678: variable 'omit' from source: magic vars 11792 1727096142.43681: starting attempt loop 11792 1727096142.43683: running the handler 11792 1727096142.43685: _low_level_execute_command(): starting 11792 1727096142.43688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096142.44565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096142.44671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096142.44689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096142.44965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096142.45085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096142.46846: stdout chunk (state=3): >>>/root <<< 11792 1727096142.46911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096142.46923: stdout chunk (state=3): >>><<< 11792 1727096142.46935: stderr chunk (state=3): >>><<< 11792 
1727096142.47170: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096142.47174: _low_level_execute_command(): starting 11792 1727096142.47177: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633 `" && echo ansible-tmp-1727096142.4708507-12936-47962382284633="` echo /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633 `" ) && sleep 0' 11792 1727096142.48434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096142.48778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096142.48861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096142.48926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096142.50886: stdout chunk (state=3): >>>ansible-tmp-1727096142.4708507-12936-47962382284633=/root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633 <<< 11792 1727096142.51033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096142.51036: stdout chunk (state=3): >>><<< 11792 1727096142.51173: stderr chunk (state=3): >>><<< 11792 1727096142.51177: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096142.4708507-12936-47962382284633=/root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096142.51180: variable 'ansible_module_compression' from source: unknown 11792 1727096142.51183: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096142.51316: variable 'ansible_facts' from source: unknown 11792 1727096142.51575: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/AnsiballZ_stat.py 11792 1727096142.51889: Sending initial data 11792 1727096142.51892: Sent initial data (152 bytes) 11792 1727096142.53086: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096142.53099: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096142.53272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096142.53320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096142.53343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096142.53400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096142.55027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096142.55049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096142.55094: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpeqccj8iq /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/AnsiballZ_stat.py <<< 11792 1727096142.55098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/AnsiballZ_stat.py" <<< 11792 1727096142.55146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpeqccj8iq" to remote "/root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/AnsiballZ_stat.py" <<< 11792 1727096142.57047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096142.57051: stdout chunk (state=3): >>><<< 11792 1727096142.57056: stderr chunk (state=3): >>><<< 11792 1727096142.57112: done transferring module to remote 11792 1727096142.57121: _low_level_execute_command(): starting 11792 1727096142.57126: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/ /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/AnsiballZ_stat.py && sleep 0' 11792 1727096142.58403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096142.58407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096142.58409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096142.58411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096142.58413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096142.58417: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096142.58419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096142.58421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096142.58423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096142.58425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096142.58427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096142.58429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096142.58431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096142.58433: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096142.58435: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096142.58436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096142.58442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096142.58444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096142.58446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096142.58503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096142.60457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096142.60580: stdout chunk (state=3): >>><<< 11792 1727096142.60583: stderr chunk (state=3): >>><<< 11792 1727096142.60588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096142.60591: _low_level_execute_command(): starting 11792 1727096142.60599: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/AnsiballZ_stat.py && sleep 0' 11792 1727096142.61869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096142.61889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096142.61985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11792 1727096142.62186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096142.62223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096142.62316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096142.78050: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096142.79517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096142.79592: stdout chunk (state=3): >>><<< 11792 1727096142.79595: stderr chunk (state=3): >>><<< 11792 1727096142.79776: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
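
[Editor's note] The stat run above belongs to the "Stat profile file" task at get_profile_stat.yml:9. From the module_args echoed in its result (path /etc/sysconfig/network-scripts/ifcfg-bond0.1 with get_attributes, get_checksum and get_mime all false) and the registered variable name profile_stat that the next task looks up, the task is roughly the following sketch; the templated path is an assumption reconstructed from the rendered value, not a copy of the test file:

    - name: Stat profile file
      ansible.builtin.stat:
        # rendered in this run as /etc/sysconfig/network-scripts/ifcfg-bond0.1
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat

On this host the profile is kept as a NetworkManager keyfile (seen later as bond0.1.nmconnection), so the initscripts ifcfg path does not exist and stat returns exists: false.
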
11792 1727096142.79780: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096142.79783: _low_level_execute_command(): starting 11792 1727096142.79786: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096142.4708507-12936-47962382284633/ > /dev/null 2>&1 && sleep 0' 11792 1727096142.80845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096142.80861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096142.80882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096142.81046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096142.81182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096142.81387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096142.83285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096142.83295: stdout chunk (state=3): >>><<< 11792 1727096142.83338: stderr chunk (state=3): >>><<< 11792 1727096142.83359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096142.83406: handler run complete 11792 1727096142.83430: attempt loop complete, returning result 11792 1727096142.83662: _execute() done 11792 1727096142.83665: dumping result to json 11792 1727096142.83670: done dumping result, returning 11792 1727096142.83672: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0afff68d-5257-d9c7-3fc0-0000000005b5] 11792 1727096142.83674: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b5 11792 1727096142.83739: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b5 11792 1727096142.83742: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11792 1727096142.83823: no more pending results, returning what we have 11792 1727096142.83827: results queue empty 11792 1727096142.83828: checking for any_errors_fatal 11792 1727096142.83837: done checking for any_errors_fatal 11792 1727096142.83838: checking for max_fail_percentage 11792 1727096142.83839: done checking for max_fail_percentage 11792 1727096142.83840: checking to see if all hosts have failed and the running result is not ok 11792 1727096142.83841: done checking to see if all hosts have failed 11792 1727096142.83841: getting the remaining hosts for this loop 11792 1727096142.83843: done getting the remaining hosts for this loop 11792 1727096142.83846: getting the next task for host managed_node2 11792 1727096142.83854: done getting next task for host managed_node2 11792 1727096142.83857: ^ task is: TASK: Set NM profile exist flag based on the profile files 11792 1727096142.83862: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096142.83869: getting variables 11792 1727096142.83870: in VariableManager get_vars() 11792 1727096142.83904: Calling all_inventory to load vars for managed_node2 11792 1727096142.83907: Calling groups_inventory to load vars for managed_node2 11792 1727096142.83911: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096142.83923: Calling all_plugins_play to load vars for managed_node2 11792 1727096142.83926: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096142.83929: Calling groups_plugins_play to load vars for managed_node2 11792 1727096142.87111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096142.91417: done with get_vars() 11792 1727096142.91443: done getting variables 11792 1727096142.91509: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:55:42 -0400 (0:00:00.511) 0:00:25.194 ****** 11792 1727096142.91544: entering _queue_task() for managed_node2/set_fact 11792 1727096142.92306: worker is 1 (out of 1 available) 11792 1727096142.92318: exiting _queue_task() for managed_node2/set_fact 11792 1727096142.92332: done queuing things up, now waiting for results queue to drain 11792 1727096142.92334: waiting for pending results... 
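
[Editor's note] The task queued here (get_profile_stat.yml:17) is skipped a few entries below because profile_stat.stat.exists evaluates to false. A minimal sketch of such a guarded set_fact, assuming it simply flips the lsr_net_profile_exists flag that the earlier "Initialize NM profile exist..." task set to false; the flag assignment is an assumption, only the when condition is confirmed by the logged false_condition:

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true    # assumption: flag initialised to false earlier in this run
      when: profile_stat.stat.exists    # logged as the false_condition when the task is skipped
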
11792 1727096142.92805: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11792 1727096142.93090: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005b6 11792 1727096142.93105: variable 'ansible_search_path' from source: unknown 11792 1727096142.93109: variable 'ansible_search_path' from source: unknown 11792 1727096142.93146: calling self._execute() 11792 1727096142.93445: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096142.93449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096142.93555: variable 'omit' from source: magic vars 11792 1727096142.94231: variable 'ansible_distribution_major_version' from source: facts 11792 1727096142.94243: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096142.94463: variable 'profile_stat' from source: set_fact 11792 1727096142.94475: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096142.94478: when evaluation is False, skipping this task 11792 1727096142.94481: _execute() done 11792 1727096142.94484: dumping result to json 11792 1727096142.94602: done dumping result, returning 11792 1727096142.94644: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-d9c7-3fc0-0000000005b6] 11792 1727096142.94647: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b6 11792 1727096142.94718: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b6 11792 1727096142.94721: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096142.94798: no more pending results, returning what we have 11792 1727096142.94802: results queue empty 11792 1727096142.94803: checking for any_errors_fatal 11792 1727096142.94811: done checking for any_errors_fatal 11792 1727096142.94812: checking for max_fail_percentage 11792 1727096142.94814: done checking for max_fail_percentage 11792 1727096142.94815: checking to see if all hosts have failed and the running result is not ok 11792 1727096142.94815: done checking to see if all hosts have failed 11792 1727096142.94816: getting the remaining hosts for this loop 11792 1727096142.94817: done getting the remaining hosts for this loop 11792 1727096142.94821: getting the next task for host managed_node2 11792 1727096142.94829: done getting next task for host managed_node2 11792 1727096142.94831: ^ task is: TASK: Get NM profile info 11792 1727096142.94836: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096142.94841: getting variables 11792 1727096142.94842: in VariableManager get_vars() 11792 1727096142.94875: Calling all_inventory to load vars for managed_node2 11792 1727096142.94878: Calling groups_inventory to load vars for managed_node2 11792 1727096142.94881: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096142.94894: Calling all_plugins_play to load vars for managed_node2 11792 1727096142.94896: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096142.94898: Calling groups_plugins_play to load vars for managed_node2 11792 1727096143.08904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096143.11212: done with get_vars() 11792 1727096143.11245: done getting variables 11792 1727096143.11308: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:55:43 -0400 (0:00:00.197) 0:00:25.392 ****** 11792 1727096143.11340: entering _queue_task() for managed_node2/shell 11792 1727096143.11896: worker is 1 (out of 1 available) 11792 1727096143.11906: exiting _queue_task() for managed_node2/shell 11792 1727096143.11917: done queuing things up, now waiting for results queue to drain 11792 1727096143.11919: waiting for pending results... 
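
[Editor's note] The shell task queued here (get_profile_stat.yml:25) ends up running the nmcli pipeline shown in the module invocation further down. A sketch under the assumption that the grep pattern is templated from the same profile/bond_port_profile include parameter seen above; the register name nm_profile_info is hypothetical, since this part of the log does not show it:

    - name: Get NM profile info
      ansible.builtin.shell: >-
        nmcli -f NAME,FILENAME connection show
        |grep {{ profile }} | grep /etc
      register: nm_profile_info   # hypothetical name, not shown in this excerpt
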
11792 1727096143.12261: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11792 1727096143.12268: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005b7 11792 1727096143.12273: variable 'ansible_search_path' from source: unknown 11792 1727096143.12277: variable 'ansible_search_path' from source: unknown 11792 1727096143.12382: calling self._execute() 11792 1727096143.12407: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.12414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.12422: variable 'omit' from source: magic vars 11792 1727096143.12901: variable 'ansible_distribution_major_version' from source: facts 11792 1727096143.12904: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096143.12907: variable 'omit' from source: magic vars 11792 1727096143.12937: variable 'omit' from source: magic vars 11792 1727096143.13040: variable 'profile' from source: include params 11792 1727096143.13118: variable 'bond_port_profile' from source: include params 11792 1727096143.13122: variable 'bond_port_profile' from source: include params 11792 1727096143.13226: variable 'omit' from source: magic vars 11792 1727096143.13231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096143.13234: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096143.13244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096143.13265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096143.13284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096143.13315: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096143.13319: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.13322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.13443: Set connection var ansible_timeout to 10 11792 1727096143.13553: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096143.13556: Set connection var ansible_shell_executable to /bin/sh 11792 1727096143.13558: Set connection var ansible_pipelining to False 11792 1727096143.13561: Set connection var ansible_shell_type to sh 11792 1727096143.13563: Set connection var ansible_connection to ssh 11792 1727096143.13566: variable 'ansible_shell_executable' from source: unknown 11792 1727096143.13570: variable 'ansible_connection' from source: unknown 11792 1727096143.13573: variable 'ansible_module_compression' from source: unknown 11792 1727096143.13575: variable 'ansible_shell_type' from source: unknown 11792 1727096143.13577: variable 'ansible_shell_executable' from source: unknown 11792 1727096143.13579: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.13581: variable 'ansible_pipelining' from source: unknown 11792 1727096143.13583: variable 'ansible_timeout' from source: unknown 11792 1727096143.13586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.13683: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096143.13696: variable 'omit' from source: magic vars 11792 1727096143.13703: starting attempt loop 11792 1727096143.13712: running the handler 11792 1727096143.13723: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096143.13976: _low_level_execute_command(): starting 11792 1727096143.13980: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096143.14596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096143.14660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096143.14712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096143.14737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096143.16589: stdout chunk (state=3): >>>/root <<< 11792 1727096143.16680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096143.16686: stdout chunk (state=3): >>><<< 11792 1727096143.16698: stderr chunk (state=3): >>><<< 11792 1727096143.16766: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096143.16773: _low_level_execute_command(): starting 11792 1727096143.16776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309 `" && echo ansible-tmp-1727096143.1672628-12977-277344056782309="` echo /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309 `" ) && sleep 0' 11792 1727096143.18120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096143.18135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096143.18184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096143.18348: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096143.18389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096143.18408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096143.18426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096143.18679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096143.20596: stdout chunk (state=3): >>>ansible-tmp-1727096143.1672628-12977-277344056782309=/root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309 <<< 11792 1727096143.20976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096143.20980: stdout chunk (state=3): >>><<< 11792 1727096143.20982: stderr chunk (state=3): >>><<< 11792 1727096143.20985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096143.1672628-12977-277344056782309=/root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096143.20987: variable 'ansible_module_compression' from source: unknown 11792 1727096143.20989: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096143.20990: variable 'ansible_facts' from source: unknown 11792 1727096143.21175: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/AnsiballZ_command.py 11792 1727096143.21406: Sending initial data 11792 1727096143.21416: Sent initial data (156 bytes) 11792 1727096143.22663: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096143.22785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096143.22844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096143.24457: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096143.24598: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp8xhl3qfq" to remote "/root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/AnsiballZ_command.py" <<< 11792 1727096143.24603: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp8xhl3qfq /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/AnsiballZ_command.py <<< 11792 1727096143.25918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096143.25934: stderr chunk (state=3): >>><<< 11792 1727096143.25963: stdout chunk (state=3): >>><<< 11792 1727096143.26169: done transferring module to remote 11792 1727096143.26173: _low_level_execute_command(): starting 11792 1727096143.26176: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/ /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/AnsiballZ_command.py && sleep 0' 11792 1727096143.27373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096143.27402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096143.27406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096143.27412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096143.27611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096143.27633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096143.29536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096143.29591: stderr chunk (state=3): >>><<< 11792 1727096143.29600: stdout chunk (state=3): >>><<< 11792 1727096143.29887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096143.29890: _low_level_execute_command(): starting 11792 1727096143.29893: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/AnsiballZ_command.py && sleep 0' 11792 1727096143.31528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096143.31579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096143.31593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096143.31612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096143.31757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096143.49390: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-23 08:55:43.471316", "end": "2024-09-23 08:55:43.492439", "delta": "0:00:00.021123", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096143.51146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096143.51151: stdout chunk (state=3): >>><<< 11792 1727096143.51158: stderr chunk (state=3): >>><<< 11792 1727096143.51180: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-23 08:55:43.471316", "end": "2024-09-23 08:55:43.492439", "delta": "0:00:00.021123", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096143.51215: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096143.51225: _low_level_execute_command(): starting 11792 1727096143.51228: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096143.1672628-12977-277344056782309/ > /dev/null 2>&1 && sleep 0' 11792 1727096143.52549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096143.52704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096143.52709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096143.52801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096143.52869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096143.54689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096143.54875: stderr chunk (state=3): >>><<< 11792 1727096143.54878: stdout chunk (state=3): >>><<< 11792 1727096143.54881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096143.54884: handler run complete 11792 1727096143.55176: Evaluated conditional (False): False 11792 1727096143.55179: attempt loop complete, returning result 11792 1727096143.55181: _execute() done 11792 1727096143.55183: dumping result to json 11792 1727096143.55185: done dumping result, returning 11792 1727096143.55187: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0afff68d-5257-d9c7-3fc0-0000000005b7] 11792 1727096143.55189: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b7 ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.021123", "end": "2024-09-23 08:55:43.492439", "rc": 0, "start": "2024-09-23 08:55:43.471316" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11792 1727096143.55320: no more pending results, returning what we have 11792 1727096143.55324: results queue empty 11792 1727096143.55325: checking for any_errors_fatal 11792 1727096143.55334: done checking for any_errors_fatal 11792 1727096143.55334: checking for max_fail_percentage 11792 1727096143.55337: done checking for max_fail_percentage 11792 1727096143.55338: checking to see if all hosts have failed and the running result is not ok 11792 1727096143.55338: done checking to see if all hosts have failed 11792 1727096143.55339: getting the remaining hosts for this loop 11792 1727096143.55340: done getting the remaining hosts for this loop 11792 1727096143.55343: getting the next task for host managed_node2 11792 1727096143.55571: done getting next task for host managed_node2 11792 1727096143.55574: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11792 1727096143.55581: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096143.55585: getting variables 11792 1727096143.55587: in VariableManager get_vars() 11792 1727096143.55619: Calling all_inventory to load vars for managed_node2 11792 1727096143.55622: Calling groups_inventory to load vars for managed_node2 11792 1727096143.55625: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096143.55638: Calling all_plugins_play to load vars for managed_node2 11792 1727096143.55641: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096143.55644: Calling groups_plugins_play to load vars for managed_node2 11792 1727096143.56308: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b7 11792 1727096143.56838: WORKER PROCESS EXITING 11792 1727096143.58315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096143.62471: done with get_vars() 11792 1727096143.62698: done getting variables 11792 1727096143.62765: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:55:43 -0400 (0:00:00.514) 0:00:25.907 ****** 11792 1727096143.62804: entering _queue_task() for managed_node2/set_fact 11792 1727096143.63657: worker is 1 (out of 1 available) 11792 1727096143.63871: exiting _queue_task() for managed_node2/set_fact 11792 1727096143.63883: done queuing things up, now waiting for results queue to drain 11792 1727096143.63884: waiting for pending results... 
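For reference, the step just completed and the one announced above reduce to a small pair of tasks. This is a minimal sketch reconstructed from the logged module arguments, the registered variable nm_profile_exists, and the facts reported in the result; the exact wording and options in get_profile_stat.yml may differ, and ignore_errors is an assumption.

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true  # assumption: a missing profile should register rc != 0 rather than fail the play

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0

With bond0.1 substituted for {{ profile }}, the shell step matches the command shown in the earlier result, and the set_fact step yields exactly the three facts reported as ansible_facts below.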
11792 1727096143.64218: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11792 1727096143.64495: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005b8 11792 1727096143.64590: variable 'ansible_search_path' from source: unknown 11792 1727096143.64597: variable 'ansible_search_path' from source: unknown 11792 1727096143.64644: calling self._execute() 11792 1727096143.64825: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.65060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.65063: variable 'omit' from source: magic vars 11792 1727096143.65672: variable 'ansible_distribution_major_version' from source: facts 11792 1727096143.65698: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096143.66017: variable 'nm_profile_exists' from source: set_fact 11792 1727096143.66051: Evaluated conditional (nm_profile_exists.rc == 0): True 11792 1727096143.66255: variable 'omit' from source: magic vars 11792 1727096143.66259: variable 'omit' from source: magic vars 11792 1727096143.66261: variable 'omit' from source: magic vars 11792 1727096143.66676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096143.66680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096143.66682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096143.66685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096143.66687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096143.66747: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096143.66759: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.66769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.66887: Set connection var ansible_timeout to 10 11792 1727096143.67322: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096143.67872: Set connection var ansible_shell_executable to /bin/sh 11792 1727096143.67876: Set connection var ansible_pipelining to False 11792 1727096143.67878: Set connection var ansible_shell_type to sh 11792 1727096143.67880: Set connection var ansible_connection to ssh 11792 1727096143.67883: variable 'ansible_shell_executable' from source: unknown 11792 1727096143.67885: variable 'ansible_connection' from source: unknown 11792 1727096143.67887: variable 'ansible_module_compression' from source: unknown 11792 1727096143.67889: variable 'ansible_shell_type' from source: unknown 11792 1727096143.67891: variable 'ansible_shell_executable' from source: unknown 11792 1727096143.67893: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.67895: variable 'ansible_pipelining' from source: unknown 11792 1727096143.67900: variable 'ansible_timeout' from source: unknown 11792 1727096143.67903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.68020: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096143.68040: variable 'omit' from source: magic vars 11792 1727096143.68090: starting attempt loop 11792 1727096143.68278: running the handler 11792 1727096143.68285: handler run complete 11792 1727096143.68302: attempt loop complete, returning result 11792 1727096143.68309: _execute() done 11792 1727096143.68315: dumping result to json 11792 1727096143.68323: done dumping result, returning 11792 1727096143.68343: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-d9c7-3fc0-0000000005b8] 11792 1727096143.68358: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b8 11792 1727096143.68638: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005b8 11792 1727096143.68641: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11792 1727096143.68703: no more pending results, returning what we have 11792 1727096143.68707: results queue empty 11792 1727096143.68707: checking for any_errors_fatal 11792 1727096143.68716: done checking for any_errors_fatal 11792 1727096143.68717: checking for max_fail_percentage 11792 1727096143.68719: done checking for max_fail_percentage 11792 1727096143.68719: checking to see if all hosts have failed and the running result is not ok 11792 1727096143.68720: done checking to see if all hosts have failed 11792 1727096143.68721: getting the remaining hosts for this loop 11792 1727096143.68722: done getting the remaining hosts for this loop 11792 1727096143.68726: getting the next task for host managed_node2 11792 1727096143.68736: done getting next task for host managed_node2 11792 1727096143.68738: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11792 1727096143.68744: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096143.68748: getting variables 11792 1727096143.68749: in VariableManager get_vars() 11792 1727096143.68787: Calling all_inventory to load vars for managed_node2 11792 1727096143.68789: Calling groups_inventory to load vars for managed_node2 11792 1727096143.68907: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096143.68917: Calling all_plugins_play to load vars for managed_node2 11792 1727096143.68920: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096143.68924: Calling groups_plugins_play to load vars for managed_node2 11792 1727096143.71790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096143.75156: done with get_vars() 11792 1727096143.75308: done getting variables 11792 1727096143.75473: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096143.75745: variable 'profile' from source: include params 11792 1727096143.75750: variable 'bond_port_profile' from source: include params 11792 1727096143.75873: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:55:43 -0400 (0:00:00.131) 0:00:26.038 ****** 11792 1727096143.75909: entering _queue_task() for managed_node2/command 11792 1727096143.76716: worker is 1 (out of 1 available) 11792 1727096143.76730: exiting _queue_task() for managed_node2/command 11792 1727096143.76742: done queuing things up, now waiting for results queue to drain 11792 1727096143.76743: waiting for pending results... 
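The task announced just above (get_profile_stat.yml:49) is a command action guarded by profile_stat.stat.exists, and it is skipped below because that condition is false: the bond0.1 profile on this host is a NetworkManager keyfile (/etc/NetworkManager/system-connections/bond0.1.nmconnection, per the nmcli output), so there is presumably no ifcfg file to inspect. A hedged sketch of what such a task might look like; the ifcfg path, grep pattern, and register name are assumptions, while the task name and when guard come from the log.

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # path and pattern are placeholders
  register: ifcfg_ansible_managed  # hypothetical variable name
  when: profile_stat.stat.exists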
11792 1727096143.77420: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11792 1727096143.77631: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005ba 11792 1727096143.77635: variable 'ansible_search_path' from source: unknown 11792 1727096143.77638: variable 'ansible_search_path' from source: unknown 11792 1727096143.77745: calling self._execute() 11792 1727096143.77928: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.77947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.77985: variable 'omit' from source: magic vars 11792 1727096143.78887: variable 'ansible_distribution_major_version' from source: facts 11792 1727096143.78959: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096143.79121: variable 'profile_stat' from source: set_fact 11792 1727096143.79156: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096143.79180: when evaluation is False, skipping this task 11792 1727096143.79234: _execute() done 11792 1727096143.79243: dumping result to json 11792 1727096143.79251: done dumping result, returning 11792 1727096143.79263: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0afff68d-5257-d9c7-3fc0-0000000005ba] 11792 1727096143.79277: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005ba 11792 1727096143.79543: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005ba 11792 1727096143.79548: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096143.79603: no more pending results, returning what we have 11792 1727096143.79607: results queue empty 11792 1727096143.79608: checking for any_errors_fatal 11792 1727096143.79615: done checking for any_errors_fatal 11792 1727096143.79615: checking for max_fail_percentage 11792 1727096143.79617: done checking for max_fail_percentage 11792 1727096143.79618: checking to see if all hosts have failed and the running result is not ok 11792 1727096143.79619: done checking to see if all hosts have failed 11792 1727096143.79619: getting the remaining hosts for this loop 11792 1727096143.79621: done getting the remaining hosts for this loop 11792 1727096143.79624: getting the next task for host managed_node2 11792 1727096143.79871: done getting next task for host managed_node2 11792 1727096143.79875: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11792 1727096143.79882: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096143.79887: getting variables 11792 1727096143.79889: in VariableManager get_vars() 11792 1727096143.79923: Calling all_inventory to load vars for managed_node2 11792 1727096143.79925: Calling groups_inventory to load vars for managed_node2 11792 1727096143.79929: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096143.79941: Calling all_plugins_play to load vars for managed_node2 11792 1727096143.79944: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096143.79946: Calling groups_plugins_play to load vars for managed_node2 11792 1727096143.83127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096143.86392: done with get_vars() 11792 1727096143.86425: done getting variables 11792 1727096143.86537: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096143.86674: variable 'profile' from source: include params 11792 1727096143.86678: variable 'bond_port_profile' from source: include params 11792 1727096143.86742: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:55:43 -0400 (0:00:00.108) 0:00:26.147 ****** 11792 1727096143.86782: entering _queue_task() for managed_node2/set_fact 11792 1727096143.87151: worker is 1 (out of 1 available) 11792 1727096143.87165: exiting _queue_task() for managed_node2/set_fact 11792 1727096143.87181: done queuing things up, now waiting for results queue to drain 11792 1727096143.87183: waiting for pending results... 
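The verify step at get_profile_stat.yml:56 is a set_fact action with the same profile_stat.stat.exists guard, so it is likewise skipped on this host. A minimal sketch, assuming it simply records that the comment was found; the fact it sets and any extra condition on the grep result are assumptions.

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: true
  when: profile_stat.stat.exists  # the real task may also check the output registered by the previous step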
11792 1727096143.87810: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11792 1727096143.87815: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005bb 11792 1727096143.87818: variable 'ansible_search_path' from source: unknown 11792 1727096143.87821: variable 'ansible_search_path' from source: unknown 11792 1727096143.87824: calling self._execute() 11792 1727096143.87828: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.87831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.87834: variable 'omit' from source: magic vars 11792 1727096143.88333: variable 'ansible_distribution_major_version' from source: facts 11792 1727096143.88343: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096143.88773: variable 'profile_stat' from source: set_fact 11792 1727096143.88776: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096143.88778: when evaluation is False, skipping this task 11792 1727096143.88781: _execute() done 11792 1727096143.88783: dumping result to json 11792 1727096143.88784: done dumping result, returning 11792 1727096143.88786: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0afff68d-5257-d9c7-3fc0-0000000005bb] 11792 1727096143.88789: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005bb 11792 1727096143.88856: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005bb 11792 1727096143.88859: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096143.88901: no more pending results, returning what we have 11792 1727096143.88905: results queue empty 11792 1727096143.88906: checking for any_errors_fatal 11792 1727096143.88911: done checking for any_errors_fatal 11792 1727096143.88912: checking for max_fail_percentage 11792 1727096143.88914: done checking for max_fail_percentage 11792 1727096143.88915: checking to see if all hosts have failed and the running result is not ok 11792 1727096143.88915: done checking to see if all hosts have failed 11792 1727096143.88916: getting the remaining hosts for this loop 11792 1727096143.88917: done getting the remaining hosts for this loop 11792 1727096143.88920: getting the next task for host managed_node2 11792 1727096143.88927: done getting next task for host managed_node2 11792 1727096143.88930: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11792 1727096143.88935: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096143.88939: getting variables 11792 1727096143.88941: in VariableManager get_vars() 11792 1727096143.89018: Calling all_inventory to load vars for managed_node2 11792 1727096143.89021: Calling groups_inventory to load vars for managed_node2 11792 1727096143.89024: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096143.89035: Calling all_plugins_play to load vars for managed_node2 11792 1727096143.89038: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096143.89040: Calling groups_plugins_play to load vars for managed_node2 11792 1727096143.90602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096143.92299: done with get_vars() 11792 1727096143.92327: done getting variables 11792 1727096143.92398: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096143.92523: variable 'profile' from source: include params 11792 1727096143.92527: variable 'bond_port_profile' from source: include params 11792 1727096143.92589: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:55:43 -0400 (0:00:00.058) 0:00:26.205 ****** 11792 1727096143.92622: entering _queue_task() for managed_node2/command 11792 1727096143.93188: worker is 1 (out of 1 available) 11792 1727096143.93198: exiting _queue_task() for managed_node2/command 11792 1727096143.93209: done queuing things up, now waiting for results queue to drain 11792 1727096143.93210: waiting for pending results... 
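The fingerprint counterpart at get_profile_stat.yml:62 follows the same pattern as the ansible_managed lookup and is skipped for the same reason. Sketch only; the grep pattern and register name are assumptions.

- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "fingerprint" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # placeholder pattern and path
  register: ifcfg_fingerprint  # hypothetical variable name
  when: profile_stat.stat.exists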
11792 1727096143.93297: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 11792 1727096143.93426: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005bc 11792 1727096143.93453: variable 'ansible_search_path' from source: unknown 11792 1727096143.93460: variable 'ansible_search_path' from source: unknown 11792 1727096143.93499: calling self._execute() 11792 1727096143.93608: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096143.93631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096143.93763: variable 'omit' from source: magic vars 11792 1727096143.94106: variable 'ansible_distribution_major_version' from source: facts 11792 1727096143.94125: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096143.94251: variable 'profile_stat' from source: set_fact 11792 1727096143.94271: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096143.94281: when evaluation is False, skipping this task 11792 1727096143.94289: _execute() done 11792 1727096143.94298: dumping result to json 11792 1727096143.94309: done dumping result, returning 11792 1727096143.94319: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0afff68d-5257-d9c7-3fc0-0000000005bc] 11792 1727096143.94326: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005bc skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096143.94594: no more pending results, returning what we have 11792 1727096143.94598: results queue empty 11792 1727096143.94599: checking for any_errors_fatal 11792 1727096143.94610: done checking for any_errors_fatal 11792 1727096143.94611: checking for max_fail_percentage 11792 1727096143.94613: done checking for max_fail_percentage 11792 1727096143.94615: checking to see if all hosts have failed and the running result is not ok 11792 1727096143.94615: done checking to see if all hosts have failed 11792 1727096143.94616: getting the remaining hosts for this loop 11792 1727096143.94618: done getting the remaining hosts for this loop 11792 1727096143.94623: getting the next task for host managed_node2 11792 1727096143.94633: done getting next task for host managed_node2 11792 1727096143.94636: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11792 1727096143.94644: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096143.94648: getting variables 11792 1727096143.94650: in VariableManager get_vars() 11792 1727096143.94711: Calling all_inventory to load vars for managed_node2 11792 1727096143.94714: Calling groups_inventory to load vars for managed_node2 11792 1727096143.94718: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096143.94763: Calling all_plugins_play to load vars for managed_node2 11792 1727096143.94770: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096143.94773: Calling groups_plugins_play to load vars for managed_node2 11792 1727096143.95315: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005bc 11792 1727096143.95319: WORKER PROCESS EXITING 11792 1727096143.97074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096143.99349: done with get_vars() 11792 1727096143.99384: done getting variables 11792 1727096143.99564: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096143.99798: variable 'profile' from source: include params 11792 1727096143.99802: variable 'bond_port_profile' from source: include params 11792 1727096143.99985: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:55:43 -0400 (0:00:00.073) 0:00:26.279 ****** 11792 1727096144.00018: entering _queue_task() for managed_node2/set_fact 11792 1727096144.00682: worker is 1 (out of 1 available) 11792 1727096144.00696: exiting _queue_task() for managed_node2/set_fact 11792 1727096144.00824: done queuing things up, now waiting for results queue to drain 11792 1727096144.00827: waiting for pending results... 
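And the matching verify step at get_profile_stat.yml:69, again a guarded set_fact that is skipped here. Sketch under the same assumptions as the previous ones.

- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists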
11792 1727096144.01006: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11792 1727096144.01218: in run() - task 0afff68d-5257-d9c7-3fc0-0000000005bd 11792 1727096144.01231: variable 'ansible_search_path' from source: unknown 11792 1727096144.01236: variable 'ansible_search_path' from source: unknown 11792 1727096144.01500: calling self._execute() 11792 1727096144.02073: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.02077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.02080: variable 'omit' from source: magic vars 11792 1727096144.02083: variable 'ansible_distribution_major_version' from source: facts 11792 1727096144.02085: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096144.02354: variable 'profile_stat' from source: set_fact 11792 1727096144.02369: Evaluated conditional (profile_stat.stat.exists): False 11792 1727096144.02373: when evaluation is False, skipping this task 11792 1727096144.02376: _execute() done 11792 1727096144.02378: dumping result to json 11792 1727096144.02381: done dumping result, returning 11792 1727096144.02385: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0afff68d-5257-d9c7-3fc0-0000000005bd] 11792 1727096144.02387: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005bd 11792 1727096144.02488: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000005bd 11792 1727096144.02491: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11792 1727096144.02538: no more pending results, returning what we have 11792 1727096144.02542: results queue empty 11792 1727096144.02543: checking for any_errors_fatal 11792 1727096144.02553: done checking for any_errors_fatal 11792 1727096144.02553: checking for max_fail_percentage 11792 1727096144.02555: done checking for max_fail_percentage 11792 1727096144.02556: checking to see if all hosts have failed and the running result is not ok 11792 1727096144.02557: done checking to see if all hosts have failed 11792 1727096144.02558: getting the remaining hosts for this loop 11792 1727096144.02559: done getting the remaining hosts for this loop 11792 1727096144.02563: getting the next task for host managed_node2 11792 1727096144.02581: done getting next task for host managed_node2 11792 1727096144.02584: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11792 1727096144.02590: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096144.02595: getting variables 11792 1727096144.02597: in VariableManager get_vars() 11792 1727096144.02631: Calling all_inventory to load vars for managed_node2 11792 1727096144.02634: Calling groups_inventory to load vars for managed_node2 11792 1727096144.02637: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096144.02650: Calling all_plugins_play to load vars for managed_node2 11792 1727096144.02653: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096144.02656: Calling groups_plugins_play to load vars for managed_node2 11792 1727096144.04369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096144.06263: done with get_vars() 11792 1727096144.06289: done getting variables 11792 1727096144.06355: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096144.06478: variable 'profile' from source: include params 11792 1727096144.06482: variable 'bond_port_profile' from source: include params 11792 1727096144.06539: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:55:44 -0400 (0:00:00.065) 0:00:26.344 ****** 11792 1727096144.06579: entering _queue_task() for managed_node2/assert 11792 1727096144.06937: worker is 1 (out of 1 available) 11792 1727096144.06949: exiting _queue_task() for managed_node2/assert 11792 1727096144.06961: done queuing things up, now waiting for results queue to drain 11792 1727096144.06963: waiting for pending results... 
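The assertion at assert_profile_present.yml:5 evaluates lsr_net_profile_exists, which the earlier set_fact made true, so it passes below. A minimal sketch of such an assert task; the fail_msg text is an assumption.

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} was not found by nmcli"  # assumed message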
11792 1727096144.07518: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' 11792 1727096144.07779: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004e8 11792 1727096144.08074: variable 'ansible_search_path' from source: unknown 11792 1727096144.08078: variable 'ansible_search_path' from source: unknown 11792 1727096144.08082: calling self._execute() 11792 1727096144.08156: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.08430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.08434: variable 'omit' from source: magic vars 11792 1727096144.09113: variable 'ansible_distribution_major_version' from source: facts 11792 1727096144.09133: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096144.09146: variable 'omit' from source: magic vars 11792 1727096144.09309: variable 'omit' from source: magic vars 11792 1727096144.09463: variable 'profile' from source: include params 11792 1727096144.09730: variable 'bond_port_profile' from source: include params 11792 1727096144.09734: variable 'bond_port_profile' from source: include params 11792 1727096144.09737: variable 'omit' from source: magic vars 11792 1727096144.09885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096144.09934: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096144.10078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096144.10105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.10125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.10165: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096144.10180: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.10191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.10526: Set connection var ansible_timeout to 10 11792 1727096144.10529: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096144.10531: Set connection var ansible_shell_executable to /bin/sh 11792 1727096144.10533: Set connection var ansible_pipelining to False 11792 1727096144.10535: Set connection var ansible_shell_type to sh 11792 1727096144.10537: Set connection var ansible_connection to ssh 11792 1727096144.10547: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.10744: variable 'ansible_connection' from source: unknown 11792 1727096144.10748: variable 'ansible_module_compression' from source: unknown 11792 1727096144.10750: variable 'ansible_shell_type' from source: unknown 11792 1727096144.10756: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.10758: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.10760: variable 'ansible_pipelining' from source: unknown 11792 1727096144.10763: variable 'ansible_timeout' from source: unknown 11792 1727096144.10764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.10948: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096144.11023: variable 'omit' from source: magic vars 11792 1727096144.11038: starting attempt loop 11792 1727096144.11079: running the handler 11792 1727096144.11377: variable 'lsr_net_profile_exists' from source: set_fact 11792 1727096144.11396: Evaluated conditional (lsr_net_profile_exists): True 11792 1727096144.11410: handler run complete 11792 1727096144.11475: attempt loop complete, returning result 11792 1727096144.11484: _execute() done 11792 1727096144.11513: dumping result to json 11792 1727096144.11523: done dumping result, returning 11792 1727096144.11790: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' [0afff68d-5257-d9c7-3fc0-0000000004e8] 11792 1727096144.11794: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e8 11792 1727096144.11885: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e8 11792 1727096144.11889: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096144.11948: no more pending results, returning what we have 11792 1727096144.11953: results queue empty 11792 1727096144.11954: checking for any_errors_fatal 11792 1727096144.11964: done checking for any_errors_fatal 11792 1727096144.11965: checking for max_fail_percentage 11792 1727096144.11971: done checking for max_fail_percentage 11792 1727096144.11972: checking to see if all hosts have failed and the running result is not ok 11792 1727096144.11973: done checking to see if all hosts have failed 11792 1727096144.11974: getting the remaining hosts for this loop 11792 1727096144.11976: done getting the remaining hosts for this loop 11792 1727096144.11979: getting the next task for host managed_node2 11792 1727096144.11988: done getting next task for host managed_node2 11792 1727096144.11991: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11792 1727096144.11995: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096144.12000: getting variables 11792 1727096144.12001: in VariableManager get_vars() 11792 1727096144.12033: Calling all_inventory to load vars for managed_node2 11792 1727096144.12150: Calling groups_inventory to load vars for managed_node2 11792 1727096144.12155: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096144.12170: Calling all_plugins_play to load vars for managed_node2 11792 1727096144.12174: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096144.12178: Calling groups_plugins_play to load vars for managed_node2 11792 1727096144.13746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096144.15372: done with get_vars() 11792 1727096144.15405: done getting variables 11792 1727096144.15481: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096144.15615: variable 'profile' from source: include params 11792 1727096144.15619: variable 'bond_port_profile' from source: include params 11792 1727096144.15685: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:55:44 -0400 (0:00:00.091) 0:00:26.436 ****** 11792 1727096144.15722: entering _queue_task() for managed_node2/assert 11792 1727096144.16208: worker is 1 (out of 1 available) 11792 1727096144.16220: exiting _queue_task() for managed_node2/assert 11792 1727096144.16232: done queuing things up, now waiting for results queue to drain 11792 1727096144.16234: waiting for pending results... 
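The second assertion (assert_profile_present.yml:10) checks lsr_net_profile_ansible_managed in the same way; the log below shows that conditional evaluating to True and the assertion passing. Sketch only.

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed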
11792 1727096144.16430: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11792 1727096144.16601: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004e9 11792 1727096144.16606: variable 'ansible_search_path' from source: unknown 11792 1727096144.16610: variable 'ansible_search_path' from source: unknown 11792 1727096144.16630: calling self._execute() 11792 1727096144.16729: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.16819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.16824: variable 'omit' from source: magic vars 11792 1727096144.17137: variable 'ansible_distribution_major_version' from source: facts 11792 1727096144.17149: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096144.17156: variable 'omit' from source: magic vars 11792 1727096144.17216: variable 'omit' from source: magic vars 11792 1727096144.17312: variable 'profile' from source: include params 11792 1727096144.17316: variable 'bond_port_profile' from source: include params 11792 1727096144.17377: variable 'bond_port_profile' from source: include params 11792 1727096144.17396: variable 'omit' from source: magic vars 11792 1727096144.17441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096144.17481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096144.17498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096144.17581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.17585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.17588: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096144.17590: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.17592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.17658: Set connection var ansible_timeout to 10 11792 1727096144.17663: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096144.17675: Set connection var ansible_shell_executable to /bin/sh 11792 1727096144.17680: Set connection var ansible_pipelining to False 11792 1727096144.17686: Set connection var ansible_shell_type to sh 11792 1727096144.17689: Set connection var ansible_connection to ssh 11792 1727096144.17706: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.17709: variable 'ansible_connection' from source: unknown 11792 1727096144.17712: variable 'ansible_module_compression' from source: unknown 11792 1727096144.17714: variable 'ansible_shell_type' from source: unknown 11792 1727096144.17717: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.17719: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.17773: variable 'ansible_pipelining' from source: unknown 11792 1727096144.17777: variable 'ansible_timeout' from source: unknown 11792 1727096144.17780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.17864: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096144.17877: variable 'omit' from source: magic vars 11792 1727096144.17884: starting attempt loop 11792 1727096144.17887: running the handler 11792 1727096144.17994: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11792 1727096144.17997: Evaluated conditional (lsr_net_profile_ansible_managed): True 11792 1727096144.18005: handler run complete 11792 1727096144.18072: attempt loop complete, returning result 11792 1727096144.18075: _execute() done 11792 1727096144.18078: dumping result to json 11792 1727096144.18080: done dumping result, returning 11792 1727096144.18083: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0afff68d-5257-d9c7-3fc0-0000000004e9] 11792 1727096144.18085: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e9 11792 1727096144.18150: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004e9 11792 1727096144.18155: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096144.18217: no more pending results, returning what we have 11792 1727096144.18221: results queue empty 11792 1727096144.18222: checking for any_errors_fatal 11792 1727096144.18231: done checking for any_errors_fatal 11792 1727096144.18232: checking for max_fail_percentage 11792 1727096144.18234: done checking for max_fail_percentage 11792 1727096144.18236: checking to see if all hosts have failed and the running result is not ok 11792 1727096144.18236: done checking to see if all hosts have failed 11792 1727096144.18237: getting the remaining hosts for this loop 11792 1727096144.18238: done getting the remaining hosts for this loop 11792 1727096144.18242: getting the next task for host managed_node2 11792 1727096144.18251: done getting next task for host managed_node2 11792 1727096144.18253: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11792 1727096144.18258: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096144.18264: getting variables 11792 1727096144.18265: in VariableManager get_vars() 11792 1727096144.18497: Calling all_inventory to load vars for managed_node2 11792 1727096144.18501: Calling groups_inventory to load vars for managed_node2 11792 1727096144.18504: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096144.18514: Calling all_plugins_play to load vars for managed_node2 11792 1727096144.18517: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096144.18520: Calling groups_plugins_play to load vars for managed_node2 11792 1727096144.20050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096144.21765: done with get_vars() 11792 1727096144.21799: done getting variables 11792 1727096144.21873: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096144.22001: variable 'profile' from source: include params 11792 1727096144.22005: variable 'bond_port_profile' from source: include params 11792 1727096144.22064: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:55:44 -0400 (0:00:00.063) 0:00:26.500 ****** 11792 1727096144.22100: entering _queue_task() for managed_node2/assert 11792 1727096144.22466: worker is 1 (out of 1 available) 11792 1727096144.22483: exiting _queue_task() for managed_node2/assert 11792 1727096144.22501: done queuing things up, now waiting for results queue to drain 11792 1727096144.22503: waiting for pending results... 
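[editor's note] The two assert tasks driving this stretch of the log (the "ansible managed comment" assert above and the "fingerprint comment" assert queued here, task path assert_profile_present.yml:15) only check boolean facts that an earlier step registered via set_fact. The playbook text itself is not part of this log, so the following is a hedged sketch of what those tasks plausibly look like: the task names and the variables lsr_net_profile_ansible_managed / lsr_net_profile_fingerprint are taken from the "Evaluated conditional (...)" records in this log, everything else is assumed.

# Hedged sketch only; not the verbatim contents of assert_profile_present.yml.
# The variable names come from the "Evaluated conditional (...)" lines in this log.
- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint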
11792 1727096144.22840: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 11792 1727096144.22893: in run() - task 0afff68d-5257-d9c7-3fc0-0000000004ea 11792 1727096144.22907: variable 'ansible_search_path' from source: unknown 11792 1727096144.22910: variable 'ansible_search_path' from source: unknown 11792 1727096144.22954: calling self._execute() 11792 1727096144.23048: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.23109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.23113: variable 'omit' from source: magic vars 11792 1727096144.23412: variable 'ansible_distribution_major_version' from source: facts 11792 1727096144.23422: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096144.23430: variable 'omit' from source: magic vars 11792 1727096144.23484: variable 'omit' from source: magic vars 11792 1727096144.23582: variable 'profile' from source: include params 11792 1727096144.23587: variable 'bond_port_profile' from source: include params 11792 1727096144.23656: variable 'bond_port_profile' from source: include params 11792 1727096144.23660: variable 'omit' from source: magic vars 11792 1727096144.23761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096144.23765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096144.23770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096144.23772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.23782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.23816: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096144.23820: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.23822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.23927: Set connection var ansible_timeout to 10 11792 1727096144.23937: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096144.23972: Set connection var ansible_shell_executable to /bin/sh 11792 1727096144.23975: Set connection var ansible_pipelining to False 11792 1727096144.23978: Set connection var ansible_shell_type to sh 11792 1727096144.23981: Set connection var ansible_connection to ssh 11792 1727096144.23983: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.23985: variable 'ansible_connection' from source: unknown 11792 1727096144.23987: variable 'ansible_module_compression' from source: unknown 11792 1727096144.23995: variable 'ansible_shell_type' from source: unknown 11792 1727096144.23997: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.23999: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.24001: variable 'ansible_pipelining' from source: unknown 11792 1727096144.24004: variable 'ansible_timeout' from source: unknown 11792 1727096144.24006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.24194: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096144.24198: variable 'omit' from source: magic vars 11792 1727096144.24200: starting attempt loop 11792 1727096144.24203: running the handler 11792 1727096144.24266: variable 'lsr_net_profile_fingerprint' from source: set_fact 11792 1727096144.24271: Evaluated conditional (lsr_net_profile_fingerprint): True 11792 1727096144.24279: handler run complete 11792 1727096144.24294: attempt loop complete, returning result 11792 1727096144.24297: _execute() done 11792 1727096144.24299: dumping result to json 11792 1727096144.24302: done dumping result, returning 11792 1727096144.24322: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 [0afff68d-5257-d9c7-3fc0-0000000004ea] 11792 1727096144.24325: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004ea 11792 1727096144.24497: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000004ea 11792 1727096144.24500: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096144.24544: no more pending results, returning what we have 11792 1727096144.24548: results queue empty 11792 1727096144.24548: checking for any_errors_fatal 11792 1727096144.24566: done checking for any_errors_fatal 11792 1727096144.24569: checking for max_fail_percentage 11792 1727096144.24571: done checking for max_fail_percentage 11792 1727096144.24572: checking to see if all hosts have failed and the running result is not ok 11792 1727096144.24573: done checking to see if all hosts have failed 11792 1727096144.24573: getting the remaining hosts for this loop 11792 1727096144.24574: done getting the remaining hosts for this loop 11792 1727096144.24578: getting the next task for host managed_node2 11792 1727096144.24586: done getting next task for host managed_node2 11792 1727096144.24589: ^ task is: TASK: ** TEST check bond settings 11792 1727096144.24592: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096144.24595: getting variables 11792 1727096144.24596: in VariableManager get_vars() 11792 1727096144.24622: Calling all_inventory to load vars for managed_node2 11792 1727096144.24624: Calling groups_inventory to load vars for managed_node2 11792 1727096144.24628: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096144.24637: Calling all_plugins_play to load vars for managed_node2 11792 1727096144.24639: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096144.24642: Calling groups_plugins_play to load vars for managed_node2 11792 1727096144.25951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096144.27748: done with get_vars() 11792 1727096144.27785: done getting variables 11792 1727096144.27848: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Monday 23 September 2024 08:55:44 -0400 (0:00:00.057) 0:00:26.558 ****** 11792 1727096144.27894: entering _queue_task() for managed_node2/command 11792 1727096144.28257: worker is 1 (out of 1 available) 11792 1727096144.28272: exiting _queue_task() for managed_node2/command 11792 1727096144.28285: done queuing things up, now waiting for results queue to drain 11792 1727096144.28287: waiting for pending results... 
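[editor's note] The "** TEST check bond settings" task that runs next loops over bond_options_to_assert, reads each option from sysfs under the controller device, and checks that the expected value appears in the command output (the log below shows the command "cat /sys/class/net/nm-bond/bonding/mode", "ansible_loop_var": "bond_opt", and "Evaluated conditional (bond_opt.value in result.stdout): True"). A hedged approximation of that task follows; assert_bond_options.yml itself is not reproduced in this log, so anything beyond those logged details is an assumption.

# Hedged sketch only; controller_device, bond_options_to_assert, bond_opt and the
# sysfs path are taken from records in this log, the rest is assumed.
- name: "** TEST check bond settings"
  command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  until: bond_opt.value in result.stdout
  loop: "{{ bond_options_to_assert }}"
  loop_control:
    loop_var: bond_opt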
11792 1727096144.28660: running TaskExecutor() for managed_node2/TASK: ** TEST check bond settings 11792 1727096144.28758: in run() - task 0afff68d-5257-d9c7-3fc0-000000000400 11792 1727096144.28762: variable 'ansible_search_path' from source: unknown 11792 1727096144.28764: variable 'ansible_search_path' from source: unknown 11792 1727096144.28791: variable 'bond_options_to_assert' from source: play vars 11792 1727096144.28991: variable 'bond_options_to_assert' from source: play vars 11792 1727096144.29188: variable 'omit' from source: magic vars 11792 1727096144.29473: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.29477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.29480: variable 'omit' from source: magic vars 11792 1727096144.29604: variable 'ansible_distribution_major_version' from source: facts 11792 1727096144.29619: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096144.29630: variable 'omit' from source: magic vars 11792 1727096144.29694: variable 'omit' from source: magic vars 11792 1727096144.29964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096144.33074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096144.33147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096144.33216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096144.33258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096144.33302: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096144.33402: variable 'controller_device' from source: play vars 11792 1727096144.33472: variable 'bond_opt' from source: unknown 11792 1727096144.33476: variable 'omit' from source: magic vars 11792 1727096144.33478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096144.33509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096144.33531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096144.33550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.33564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.33613: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096144.33616: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.33618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.33721: Set connection var ansible_timeout to 10 11792 1727096144.33808: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096144.33811: Set connection var ansible_shell_executable to /bin/sh 11792 1727096144.33813: Set connection var ansible_pipelining to False 11792 1727096144.33815: Set connection var ansible_shell_type to sh 11792 1727096144.33817: Set connection var ansible_connection to ssh 11792 
1727096144.33819: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.33821: variable 'ansible_connection' from source: unknown 11792 1727096144.33823: variable 'ansible_module_compression' from source: unknown 11792 1727096144.33826: variable 'ansible_shell_type' from source: unknown 11792 1727096144.33828: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.33830: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.33832: variable 'ansible_pipelining' from source: unknown 11792 1727096144.33833: variable 'ansible_timeout' from source: unknown 11792 1727096144.33835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.33950: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096144.33956: variable 'omit' from source: magic vars 11792 1727096144.33958: starting attempt loop 11792 1727096144.33960: running the handler 11792 1727096144.34025: _low_level_execute_command(): starting 11792 1727096144.34028: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096144.34832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.34917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.34987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096144.35042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.35102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.35345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.37054: stdout chunk (state=3): >>>/root <<< 11792 1727096144.37365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.37371: stdout chunk (state=3): >>><<< 11792 1727096144.37375: stderr chunk (state=3): >>><<< 11792 1727096144.37378: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096144.37388: _low_level_execute_command(): starting 11792 1727096144.37391: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219 `" && echo ansible-tmp-1727096144.3726964-13028-196096010646219="` echo /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219 `" ) && sleep 0' 11792 1727096144.37984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096144.37998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.38019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096144.38083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.38139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096144.38156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.38184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.38248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.40351: stdout chunk (state=3): >>>ansible-tmp-1727096144.3726964-13028-196096010646219=/root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219 <<< 11792 1727096144.40499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.40777: stderr chunk (state=3): >>><<< 11792 1727096144.40781: stdout chunk (state=3): >>><<< 11792 1727096144.40784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096144.3726964-13028-196096010646219=/root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096144.40787: variable 'ansible_module_compression' from source: unknown 11792 1727096144.40789: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096144.40791: variable 'ansible_facts' from source: unknown 11792 1727096144.40909: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/AnsiballZ_command.py 11792 1727096144.41130: Sending initial data 11792 1727096144.41134: Sent initial data (156 bytes) 11792 1727096144.41784: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096144.41801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.41890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.41935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096144.41949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.41994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.42062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.43764: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096144.43794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096144.43851: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmps304orvw /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/AnsiballZ_command.py <<< 11792 1727096144.43857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/AnsiballZ_command.py" <<< 11792 1727096144.44402: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmps304orvw" to remote "/root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/AnsiballZ_command.py" <<< 11792 1727096144.45913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.45917: stderr chunk (state=3): >>><<< 11792 1727096144.45919: stdout chunk (state=3): >>><<< 11792 1727096144.45921: done transferring module to remote 11792 1727096144.45923: _low_level_execute_command(): starting 11792 1727096144.45925: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/ /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/AnsiballZ_command.py && sleep 0' 11792 1727096144.46906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.46919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096144.46930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.47189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096144.47230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.47250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.49090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.49119: stderr chunk (state=3): >>><<< 11792 1727096144.49128: stdout chunk (state=3): >>><<< 
11792 1727096144.49155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096144.49188: _low_level_execute_command(): starting 11792 1727096144.49473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/AnsiballZ_command.py && sleep 0' 11792 1727096144.50696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.50765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.50821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.66716: stdout chunk (state=3): >>> {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-23 08:55:44.662646", "end": "2024-09-23 08:55:44.665804", "delta": "0:00:00.003158", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096144.68549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096144.68556: stdout chunk (state=3): >>><<< 11792 1727096144.68559: stderr chunk (state=3): >>><<< 11792 1727096144.68580: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-23 08:55:44.662646", "end": "2024-09-23 08:55:44.665804", "delta": "0:00:00.003158", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096144.68615: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096144.68622: _low_level_execute_command(): starting 11792 1727096144.68628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096144.3726964-13028-196096010646219/ > /dev/null 2>&1 && sleep 0' 11792 1727096144.69789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096144.69840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.69862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096144.69889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096144.70118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.70154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.70221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.72144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.72182: stdout chunk (state=3): >>><<< 11792 1727096144.72572: stderr chunk (state=3): >>><<< 11792 1727096144.72576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096144.72579: handler run complete 11792 1727096144.72581: Evaluated conditional (False): False 11792 1727096144.72808: variable 'bond_opt' from source: unknown 11792 1727096144.72811: variable 'result' from source: unknown 11792 1727096144.72813: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096144.72815: attempt loop complete, returning result 11792 1727096144.72817: variable 'bond_opt' from source: unknown 11792 1727096144.72909: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'mode', 'value': '802.3ad'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "802.3ad" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003158", "end": "2024-09-23 08:55:44.665804", "rc": 0, "start": "2024-09-23 08:55:44.662646" } STDOUT: 802.3ad 4 11792 1727096144.73580: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.73584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.73586: variable 'omit' from source: magic vars 11792 1727096144.73999: variable 'ansible_distribution_major_version' from source: facts 11792 1727096144.74002: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096144.74005: variable 'omit' from source: magic vars 11792 1727096144.74009: variable 'omit' from source: magic vars 11792 1727096144.74434: variable 'controller_device' from source: play vars 11792 1727096144.74437: variable 'bond_opt' from source: unknown 11792 1727096144.74440: variable 'omit' from source: magic vars 11792 1727096144.74442: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096144.74449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.74453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096144.74561: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096144.74564: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.74567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.74743: Set connection var ansible_timeout to 10 11792 1727096144.74781: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096144.74817: Set connection var ansible_shell_executable to /bin/sh 11792 1727096144.75082: Set connection var ansible_pipelining to False 11792 1727096144.75086: Set connection var ansible_shell_type to sh 11792 1727096144.75088: Set connection var ansible_connection to ssh 11792 1727096144.75090: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.75093: variable 'ansible_connection' from source: unknown 11792 
1727096144.75096: variable 'ansible_module_compression' from source: unknown 11792 1727096144.75098: variable 'ansible_shell_type' from source: unknown 11792 1727096144.75100: variable 'ansible_shell_executable' from source: unknown 11792 1727096144.75102: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096144.75104: variable 'ansible_pipelining' from source: unknown 11792 1727096144.75106: variable 'ansible_timeout' from source: unknown 11792 1727096144.75108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096144.75171: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096144.75201: variable 'omit' from source: magic vars 11792 1727096144.75226: starting attempt loop 11792 1727096144.75373: running the handler 11792 1727096144.75377: _low_level_execute_command(): starting 11792 1727096144.75379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096144.76528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096144.76532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.76535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.76537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096144.76544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.76695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.76712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.77080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.78449: stdout chunk (state=3): >>>/root <<< 11792 1727096144.78542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.78595: stderr chunk (state=3): >>><<< 11792 1727096144.78599: stdout chunk (state=3): >>><<< 11792 1727096144.78719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096144.78728: _low_level_execute_command(): starting 11792 1727096144.78734: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436 `" && echo ansible-tmp-1727096144.787183-13028-190950493433436="` echo /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436 `" ) && sleep 0' 11792 1727096144.79917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096144.80275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096144.80296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.80361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.82372: stdout chunk (state=3): >>>ansible-tmp-1727096144.787183-13028-190950493433436=/root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436 <<< 11792 1727096144.82521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.82525: stdout chunk (state=3): >>><<< 11792 1727096144.82531: stderr chunk (state=3): >>><<< 11792 1727096144.82550: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096144.787183-13028-190950493433436=/root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096144.82580: variable 'ansible_module_compression' from source: unknown 11792 1727096144.82617: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096144.82640: variable 'ansible_facts' from source: unknown 11792 1727096144.82714: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/AnsiballZ_command.py 11792 1727096144.83466: Sending initial data 11792 1727096144.83471: Sent initial data (155 bytes) 11792 1727096144.84075: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096144.84094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096144.84098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.84155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096144.84284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.84288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.85901: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096144.85942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096144.85994: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp7p2ye3ql /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/AnsiballZ_command.py <<< 11792 1727096144.85998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/AnsiballZ_command.py" <<< 11792 1727096144.86049: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp7p2ye3ql" to remote "/root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/AnsiballZ_command.py" <<< 11792 1727096144.86942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.87174: stderr chunk (state=3): >>><<< 11792 1727096144.87178: stdout chunk (state=3): >>><<< 11792 1727096144.87180: done transferring module to remote 11792 1727096144.87182: _low_level_execute_command(): starting 11792 1727096144.87185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/ /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/AnsiballZ_command.py && sleep 0' 11792 1727096144.87641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096144.87749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.87755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.87758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096144.87761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096144.87778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.87794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.87893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096144.89774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096144.89848: stderr chunk (state=3): >>><<< 11792 1727096144.89854: stdout chunk (state=3): >>><<< 11792 1727096144.89873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096144.89876: _low_level_execute_command(): starting 11792 1727096144.89882: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/AnsiballZ_command.py && sleep 0' 11792 1727096144.90527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096144.90543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096144.90564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096144.90587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096144.90605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096144.90617: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096144.90632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096144.90655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096144.90670: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096144.90684: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096144.90773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096144.90988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096144.91057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.07577: stdout chunk (state=3): >>> {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-23 08:55:45.071007", "end": "2024-09-23 08:55:45.074366", "delta": "0:00:00.003359", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", 
"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096145.09379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096145.09391: stdout chunk (state=3): >>><<< 11792 1727096145.09439: stderr chunk (state=3): >>><<< 11792 1727096145.09470: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-23 08:55:45.071007", "end": "2024-09-23 08:55:45.074366", "delta": "0:00:00.003359", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096145.09585: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096145.09756: _low_level_execute_command(): starting 11792 1727096145.09759: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096144.787183-13028-190950493433436/ > /dev/null 2>&1 && sleep 0' 11792 1727096145.10364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.10375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.10387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.10403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096145.10414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096145.10420: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096145.10473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.10476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096145.10479: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096145.10481: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096145.10483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.10485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.10487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096145.10490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096145.10498: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096145.10507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.10570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.10593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.10596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.10660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.12562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.12774: stderr chunk (state=3): >>><<< 11792 1727096145.12777: stdout chunk (state=3): >>><<< 11792 1727096145.12780: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.12782: handler run complete 11792 1727096145.12785: Evaluated conditional (False): False 11792 1727096145.12874: variable 'bond_opt' from source: unknown 11792 1727096145.12877: variable 'result' from source: unknown 11792 1727096145.12879: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096145.12881: attempt loop complete, returning result 11792 1727096145.12883: variable 'bond_opt' from source: unknown 11792 1727096145.12959: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_actor_sys_prio', 'value': '65535'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_sys_prio", "value": "65535" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio" ], "delta": "0:00:00.003359", "end": "2024-09-23 08:55:45.074366", "rc": 0, "start": "2024-09-23 08:55:45.071007" } STDOUT: 65535 11792 1727096145.13239: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.13243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.13247: variable 'omit' from source: magic vars 11792 1727096145.13573: variable 'ansible_distribution_major_version' from source: facts 11792 1727096145.13576: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096145.13580: variable 'omit' from source: magic vars 11792 1727096145.13583: variable 'omit' from source: magic vars 11792 1727096145.13586: variable 'controller_device' from source: play vars 11792 1727096145.13588: variable 'bond_opt' from source: unknown 11792 1727096145.13590: variable 'omit' from source: magic vars 11792 1727096145.13593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096145.13596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096145.13599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096145.13601: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096145.13604: variable 
'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.13606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.13773: Set connection var ansible_timeout to 10 11792 1727096145.13776: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096145.13779: Set connection var ansible_shell_executable to /bin/sh 11792 1727096145.13782: Set connection var ansible_pipelining to False 11792 1727096145.13784: Set connection var ansible_shell_type to sh 11792 1727096145.13787: Set connection var ansible_connection to ssh 11792 1727096145.13790: variable 'ansible_shell_executable' from source: unknown 11792 1727096145.13792: variable 'ansible_connection' from source: unknown 11792 1727096145.13795: variable 'ansible_module_compression' from source: unknown 11792 1727096145.13798: variable 'ansible_shell_type' from source: unknown 11792 1727096145.13801: variable 'ansible_shell_executable' from source: unknown 11792 1727096145.13809: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.13811: variable 'ansible_pipelining' from source: unknown 11792 1727096145.13813: variable 'ansible_timeout' from source: unknown 11792 1727096145.13815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.13838: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096145.13846: variable 'omit' from source: magic vars 11792 1727096145.13849: starting attempt loop 11792 1727096145.13851: running the handler 11792 1727096145.13861: _low_level_execute_command(): starting 11792 1727096145.13864: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096145.14676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.14681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.14699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.14714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.14732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.14915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.16609: stdout chunk (state=3): >>>/root <<< 11792 1727096145.16813: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11792 1727096145.16817: stdout chunk (state=3): >>><<< 11792 1727096145.16823: stderr chunk (state=3): >>><<< 11792 1727096145.16966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.16979: _low_level_execute_command(): starting 11792 1727096145.16985: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318 `" && echo ansible-tmp-1727096145.169661-13028-210786654053318="` echo /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318 `" ) && sleep 0' 11792 1727096145.18317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.18482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.18595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.18673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.20705: stdout chunk (state=3): >>>ansible-tmp-1727096145.169661-13028-210786654053318=/root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318 <<< 11792 1727096145.20948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.20952: stderr chunk (state=3): >>><<< 11792 1727096145.20955: 
stdout chunk (state=3): >>><<< 11792 1727096145.20958: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096145.169661-13028-210786654053318=/root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.20960: variable 'ansible_module_compression' from source: unknown 11792 1727096145.20962: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096145.20974: variable 'ansible_facts' from source: unknown 11792 1727096145.21045: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/AnsiballZ_command.py 11792 1727096145.21299: Sending initial data 11792 1727096145.21302: Sent initial data (155 bytes) 11792 1727096145.21873: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.21955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.22246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.22323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.24005: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096145.24072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096145.24109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmptdnw13bc /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/AnsiballZ_command.py <<< 11792 1727096145.24113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/AnsiballZ_command.py" <<< 11792 1727096145.24165: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmptdnw13bc" to remote "/root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/AnsiballZ_command.py" <<< 11792 1727096145.25074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.25078: stdout chunk (state=3): >>><<< 11792 1727096145.25080: stderr chunk (state=3): >>><<< 11792 1727096145.25082: done transferring module to remote 11792 1727096145.25084: _low_level_execute_command(): starting 11792 1727096145.25086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/ /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/AnsiballZ_command.py && sleep 0' 11792 1727096145.25705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.25722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.25762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.25871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.25892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.25979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 
1727096145.27969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.27974: stdout chunk (state=3): >>><<< 11792 1727096145.27979: stderr chunk (state=3): >>><<< 11792 1727096145.28080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.28084: _low_level_execute_command(): starting 11792 1727096145.28087: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/AnsiballZ_command.py && sleep 0' 11792 1727096145.28599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.28608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.28618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.28633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096145.28647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096145.28657: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096145.28664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.28682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096145.28689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096145.28696: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096145.28704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.28714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.28727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096145.28874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096145.28888: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096145.28891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.28893: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.28895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.28898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.28935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.45246: stdout chunk (state=3): >>> {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-23 08:55:45.447544", "end": "2024-09-23 08:55:45.450787", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096145.47075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096145.47079: stderr chunk (state=3): >>><<< 11792 1727096145.47081: stdout chunk (state=3): >>><<< 11792 1727096145.47084: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-23 08:55:45.447544", "end": "2024-09-23 08:55:45.450787", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
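Each loop item in this task boils down to a `cat` of one bonding attribute under /sys/class/net/nm-bond/bonding/, and the success condition evaluated afterwards is `bond_opt.value in result.stdout`. A minimal stand-alone sketch of that per-item check (the function name and its default device argument are mine, not part of the play) follows:

```python
# Stand-alone sketch of the per-item verification: read the bonding attribute from
# sysfs and test whether the expected value is a substring of the command's stdout,
# mirroring the logged condition "bond_opt.value in result.stdout".
import subprocess

def bond_option_matches(option: str, expected: str, device: str = "nm-bond") -> bool:
    """Return True when `expected` appears in `cat /sys/class/net/<device>/bonding/<option>`."""
    path = f"/sys/class/net/{device}/bonding/{option}"
    out = subprocess.run(["cat", path], capture_output=True, text=True, check=True).stdout
    return expected in out

# From the results logged above, on the managed node:
#   bond_option_matches("ad_actor_sys_prio", "65535")            -> True
#   bond_option_matches("ad_actor_system", "00:00:5e:00:53:5d")  -> True
```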
11792 1727096145.47086: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_system', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096145.47089: _low_level_execute_command(): starting 11792 1727096145.47091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096145.169661-13028-210786654053318/ > /dev/null 2>&1 && sleep 0' 11792 1727096145.47673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.47681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.47691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.47704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096145.47715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096145.47729: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096145.47739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.47831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.47859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.47883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.47932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.49979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.49983: stdout chunk (state=3): >>><<< 11792 1727096145.49986: stderr chunk (state=3): >>><<< 11792 1727096145.49988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.49991: handler run complete 11792 1727096145.49993: Evaluated conditional (False): False 11792 1727096145.50081: variable 'bond_opt' from source: unknown 11792 1727096145.50084: variable 'result' from source: unknown 11792 1727096145.50097: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096145.50111: attempt loop complete, returning result 11792 1727096145.50135: variable 'bond_opt' from source: unknown 11792 1727096145.50204: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_actor_system', 'value': '00:00:5e:00:53:5d'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_system", "value": "00:00:5e:00:53:5d" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_system" ], "delta": "0:00:00.003243", "end": "2024-09-23 08:55:45.450787", "rc": 0, "start": "2024-09-23 08:55:45.447544" } STDOUT: 00:00:5e:00:53:5d 11792 1727096145.50352: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.50356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.50359: variable 'omit' from source: magic vars 11792 1727096145.50544: variable 'ansible_distribution_major_version' from source: facts 11792 1727096145.50547: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096145.50550: variable 'omit' from source: magic vars 11792 1727096145.50552: variable 'omit' from source: magic vars 11792 1727096145.50684: variable 'controller_device' from source: play vars 11792 1727096145.50694: variable 'bond_opt' from source: unknown 11792 1727096145.50713: variable 'omit' from source: magic vars 11792 1727096145.50735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096145.50756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096145.50759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096145.50866: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096145.50871: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.50873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.50876: Set connection var ansible_timeout to 10 11792 1727096145.50878: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096145.50880: Set connection var ansible_shell_executable to /bin/sh 11792 1727096145.50882: Set connection var ansible_pipelining to False 11792 1727096145.50884: 
Set connection var ansible_shell_type to sh 11792 1727096145.50886: Set connection var ansible_connection to ssh 11792 1727096145.50909: variable 'ansible_shell_executable' from source: unknown 11792 1727096145.50914: variable 'ansible_connection' from source: unknown 11792 1727096145.50917: variable 'ansible_module_compression' from source: unknown 11792 1727096145.50919: variable 'ansible_shell_type' from source: unknown 11792 1727096145.50923: variable 'ansible_shell_executable' from source: unknown 11792 1727096145.50925: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.50930: variable 'ansible_pipelining' from source: unknown 11792 1727096145.50933: variable 'ansible_timeout' from source: unknown 11792 1727096145.50937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.51038: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096145.51046: variable 'omit' from source: magic vars 11792 1727096145.51049: starting attempt loop 11792 1727096145.51051: running the handler 11792 1727096145.51062: _low_level_execute_command(): starting 11792 1727096145.51065: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096145.51723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.51783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.51861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.51865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.51883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.51959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.53806: stdout chunk (state=3): >>>/root <<< 11792 1727096145.53810: stdout chunk (state=3): >>><<< 11792 1727096145.53812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.53815: stderr chunk (state=3): >>><<< 11792 1727096145.53843: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.53894: _low_level_execute_command(): starting 11792 1727096145.53898: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415 `" && echo ansible-tmp-1727096145.5384836-13028-173530036489415="` echo /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415 `" ) && sleep 0' 11792 1727096145.54566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.54633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.54672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.54696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.54785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.56790: stdout chunk (state=3): >>>ansible-tmp-1727096145.5384836-13028-173530036489415=/root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415 <<< 11792 1727096145.57185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.57188: stdout chunk (state=3): >>><<< 11792 1727096145.57191: stderr chunk (state=3): >>><<< 11792 1727096145.57193: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096145.5384836-13028-173530036489415=/root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.57195: variable 'ansible_module_compression' from source: unknown 11792 1727096145.57197: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096145.57199: variable 'ansible_facts' from source: unknown 11792 1727096145.57201: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/AnsiballZ_command.py 11792 1727096145.57389: Sending initial data 11792 1727096145.57393: Sent initial data (156 bytes) 11792 1727096145.57907: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.57915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.57925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.57939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096145.57950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096145.57956: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096145.57966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.57982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096145.57988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096145.57995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096145.58002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.58201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.58303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.58337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.60005: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096145.60080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096145.60083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpetiiccll /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/AnsiballZ_command.py <<< 11792 1727096145.60086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/AnsiballZ_command.py" <<< 11792 1727096145.60143: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpetiiccll" to remote "/root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/AnsiballZ_command.py" <<< 11792 1727096145.60880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.61042: stderr chunk (state=3): >>><<< 11792 1727096145.61045: stdout chunk (state=3): >>><<< 11792 1727096145.61058: done transferring module to remote 11792 1727096145.61076: _low_level_execute_command(): starting 11792 1727096145.61113: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/ /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/AnsiballZ_command.py && sleep 0' 11792 1727096145.62029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.62086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.62165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.62205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.62255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11792 1727096145.62410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.64375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.64379: stderr chunk (state=3): >>><<< 11792 1727096145.64381: stdout chunk (state=3): >>><<< 11792 1727096145.64383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.64386: _low_level_execute_command(): starting 11792 1727096145.64388: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/AnsiballZ_command.py && sleep 0' 11792 1727096145.65189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.65206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.65223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.65239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096145.65256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096145.65266: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096145.65355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.65789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.65931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.82077: stdout chunk (state=3): >>> {"changed": true, "stdout": 
"stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-23 08:55:45.815821", "end": "2024-09-23 08:55:45.819334", "delta": "0:00:00.003513", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096145.84182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096145.84187: stdout chunk (state=3): >>><<< 11792 1727096145.84190: stderr chunk (state=3): >>><<< 11792 1727096145.84193: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-23 08:55:45.815821", "end": "2024-09-23 08:55:45.819334", "delta": "0:00:00.003513", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096145.84196: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_select', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096145.84198: _low_level_execute_command(): starting 11792 1727096145.84200: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096145.5384836-13028-173530036489415/ > /dev/null 2>&1 && sleep 0' 11792 1727096145.85211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.85216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.85227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096145.85342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.85472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096145.85485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.85604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.87693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.88075: stderr chunk (state=3): >>><<< 11792 1727096145.88079: stdout chunk (state=3): >>><<< 11792 1727096145.88081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.88083: handler run complete 11792 1727096145.88085: Evaluated conditional (False): False 11792 1727096145.88087: variable 'bond_opt' from source: unknown 11792 1727096145.88375: variable 'result' from source: unknown 11792 1727096145.88379: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096145.88381: attempt loop complete, returning result 11792 1727096145.88383: variable 'bond_opt' from source: unknown 11792 1727096145.88385: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_select', 'value': 'stable'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_select", "value": "stable" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_select" ], "delta": "0:00:00.003513", "end": "2024-09-23 08:55:45.819334", "rc": 0, "start": "2024-09-23 08:55:45.815821" } STDOUT: stable 0 11792 1727096145.88738: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.88833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.88836: variable 'omit' from source: magic vars 11792 1727096145.89383: variable 'ansible_distribution_major_version' from source: facts 11792 1727096145.89386: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096145.89388: variable 'omit' from source: magic vars 11792 1727096145.89390: variable 'omit' from source: magic vars 11792 1727096145.89585: variable 'controller_device' from source: play vars 11792 1727096145.89701: variable 'bond_opt' from source: unknown 11792 1727096145.89708: variable 'omit' from source: magic vars 11792 1727096145.89735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096145.89749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096145.89918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096145.89922: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096145.89924: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.89926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.90040: Set connection var ansible_timeout to 10 11792 1727096145.90057: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096145.90073: Set connection var ansible_shell_executable to /bin/sh 11792 1727096145.90084: Set connection var ansible_pipelining to False 11792 1727096145.90111: Set connection var ansible_shell_type to sh 11792 1727096145.90141: Set connection var ansible_connection to ssh 11792 1727096145.90172: variable 'ansible_shell_executable' from source: unknown 11792 1727096145.90355: variable 'ansible_connection' from source: unknown 11792 
1727096145.90359: variable 'ansible_module_compression' from source: unknown 11792 1727096145.90361: variable 'ansible_shell_type' from source: unknown 11792 1727096145.90363: variable 'ansible_shell_executable' from source: unknown 11792 1727096145.90365: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096145.90368: variable 'ansible_pipelining' from source: unknown 11792 1727096145.90371: variable 'ansible_timeout' from source: unknown 11792 1727096145.90373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096145.90492: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096145.90528: variable 'omit' from source: magic vars 11792 1727096145.90561: starting attempt loop 11792 1727096145.90593: running the handler 11792 1727096145.90620: _low_level_execute_command(): starting 11792 1727096145.90813: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096145.91947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096145.91950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096145.91997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.92000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096145.92002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096145.92189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.92192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.92243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.94110: stdout chunk (state=3): >>>/root <<< 11792 1727096145.94190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.94485: stderr chunk (state=3): >>><<< 11792 1727096145.94493: stdout chunk (state=3): >>><<< 11792 1727096145.94496: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.94498: _low_level_execute_command(): starting 11792 1727096145.94500: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229 `" && echo ansible-tmp-1727096145.9440324-13028-70399182960229="` echo /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229 `" ) && sleep 0' 11792 1727096145.95612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096145.95884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096145.95902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096145.95975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096145.98020: stdout chunk (state=3): >>>ansible-tmp-1727096145.9440324-13028-70399182960229=/root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229 <<< 11792 1727096145.98474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096145.98477: stdout chunk (state=3): >>><<< 11792 1727096145.98480: stderr chunk (state=3): >>><<< 11792 1727096145.98482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096145.9440324-13028-70399182960229=/root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096145.98484: variable 'ansible_module_compression' from source: unknown 11792 1727096145.98485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096145.98487: variable 'ansible_facts' from source: unknown 11792 1727096145.98489: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/AnsiballZ_command.py 11792 1727096145.98930: Sending initial data 11792 1727096145.98941: Sent initial data (155 bytes) 11792 1727096146.00070: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.00085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096146.00135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.00283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.00388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.00444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.02116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096146.02249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096146.02321: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpup5fafjq /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/AnsiballZ_command.py <<< 11792 1727096146.02333: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpup5fafjq" to remote "/root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/AnsiballZ_command.py" <<< 11792 1727096146.03824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.03829: stderr chunk (state=3): >>><<< 11792 1727096146.03831: stdout chunk (state=3): >>><<< 11792 1727096146.03856: done transferring module to remote 11792 1727096146.03859: _low_level_execute_command(): starting 11792 1727096146.03866: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/ /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/AnsiballZ_command.py && sleep 0' 11792 1727096146.04900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.04905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.05090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.05305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.07008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.07054: stderr chunk (state=3): >>><<< 11792 1727096146.07057: stdout chunk (state=3): >>><<< 11792 1727096146.07075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.07078: _low_level_execute_command(): starting 11792 1727096146.07083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/AnsiballZ_command.py && sleep 0' 11792 1727096146.08377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.08542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.08546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.08548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.08551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.24616: stdout chunk (state=3): >>> {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-23 08:55:46.241446", "end": "2024-09-23 08:55:46.244579", "delta": "0:00:00.003133", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096146.26570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096146.26573: stdout chunk (state=3): >>><<< 11792 1727096146.26575: stderr chunk (state=3): >>><<< 11792 1727096146.26577: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-23 08:55:46.241446", "end": "2024-09-23 08:55:46.244579", "delta": "0:00:00.003133", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
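 
The surrounding trace repeats the same per-task remote sequence for every bond option: probe the remote home directory (`echo ~`), create a private temp directory under `umask 77`, upload the AnsiballZ payload over the SFTP subsystem, mark it executable, run it with the remote `/usr/bin/python3.12`, then remove the temp directory. The sketch below reproduces that sequence with plain `ssh`/`scp` for illustration; it is not ansible-core code, the host and payload name are taken from the trace, and `scp` stands in for the sftp `put` seen above.

```python
# Illustrative sketch (not ansible-core source) of the per-task remote command
# sequence visible in the trace. Ansible drives these steps over one
# multiplexed SSH connection; this sketch opens a fresh connection per step.
import subprocess

HOST = "root@10.31.15.126"        # target seen in the trace
PAYLOAD = "AnsiballZ_command.py"  # payload name seen in the trace

def ssh(cmd: str) -> str:
    return subprocess.run(["ssh", HOST, cmd], check=True,
                          capture_output=True, text=True).stdout

home = ssh("echo ~ && sleep 0").strip()
tmp = f"{home}/.ansible/tmp/example-tmp"           # placeholder temp dir name
ssh(f'( umask 77 && mkdir -p "{tmp}" ) && sleep 0')
subprocess.run(["scp", PAYLOAD, f"{HOST}:{tmp}/{PAYLOAD}"], check=True)
ssh(f'chmod u+x "{tmp}" "{tmp}/{PAYLOAD}" && sleep 0')
print(ssh(f'/usr/bin/python3.12 "{tmp}/{PAYLOAD}" && sleep 0'))
ssh(f'rm -f -r "{tmp}" > /dev/null 2>&1 && sleep 0')
```
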
11792 1727096146.26579: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_user_port_key', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096146.26585: _low_level_execute_command(): starting 11792 1727096146.26587: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096145.9440324-13028-70399182960229/ > /dev/null 2>&1 && sleep 0' 11792 1727096146.27331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.27518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.27522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.27524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096146.27526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096146.27528: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096146.27531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.27534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096146.27536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096146.27539: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096146.27542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.27544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.27547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.27549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.27684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.29573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.29577: stdout chunk (state=3): >>><<< 11792 1727096146.29579: stderr chunk (state=3): >>><<< 11792 1727096146.29774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.29778: handler run complete 11792 1727096146.29780: Evaluated conditional (False): False 11792 1727096146.29783: variable 'bond_opt' from source: unknown 11792 1727096146.29784: variable 'result' from source: unknown 11792 1727096146.29801: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096146.29817: attempt loop complete, returning result 11792 1727096146.29842: variable 'bond_opt' from source: unknown 11792 1727096146.29921: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'ad_user_port_key', 'value': '1023'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_user_port_key", "value": "1023" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key" ], "delta": "0:00:00.003133", "end": "2024-09-23 08:55:46.244579", "rc": 0, "start": "2024-09-23 08:55:46.241446" } STDOUT: 1023 11792 1727096146.30273: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096146.30277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096146.30280: variable 'omit' from source: magic vars 11792 1727096146.30453: variable 'ansible_distribution_major_version' from source: facts 11792 1727096146.30456: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096146.30458: variable 'omit' from source: magic vars 11792 1727096146.30461: variable 'omit' from source: magic vars 11792 1727096146.30550: variable 'controller_device' from source: play vars 11792 1727096146.30574: variable 'bond_opt' from source: unknown 11792 1727096146.30598: variable 'omit' from source: magic vars 11792 1727096146.30623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096146.30638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096146.30649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096146.30679: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096146.30778: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096146.30781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096146.30784: Set connection var ansible_timeout to 10 11792 1727096146.30791: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11792 1727096146.30806: Set connection var ansible_shell_executable to /bin/sh 11792 1727096146.30816: Set connection var ansible_pipelining to False 11792 1727096146.30822: Set connection var ansible_shell_type to sh 11792 1727096146.30829: Set connection var ansible_connection to ssh 11792 1727096146.30852: variable 'ansible_shell_executable' from source: unknown 11792 1727096146.30873: variable 'ansible_connection' from source: unknown 11792 1727096146.30876: variable 'ansible_module_compression' from source: unknown 11792 1727096146.30885: variable 'ansible_shell_type' from source: unknown 11792 1727096146.30887: variable 'ansible_shell_executable' from source: unknown 11792 1727096146.30897: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096146.30973: variable 'ansible_pipelining' from source: unknown 11792 1727096146.30976: variable 'ansible_timeout' from source: unknown 11792 1727096146.30978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096146.31030: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096146.31043: variable 'omit' from source: magic vars 11792 1727096146.31050: starting attempt loop 11792 1727096146.31057: running the handler 11792 1727096146.31069: _low_level_execute_command(): starting 11792 1727096146.31078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096146.31915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.31936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.31981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.32055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.32084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.32097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.32169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.33877: stdout chunk (state=3): >>>/root <<< 11792 1727096146.34035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.34039: stdout chunk (state=3): >>><<< 11792 1727096146.34042: stderr chunk (state=3): >>><<< 11792 1727096146.34064: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.34081: _low_level_execute_command(): starting 11792 1727096146.34159: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319 `" && echo ansible-tmp-1727096146.34071-13028-92320241046319="` echo /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319 `" ) && sleep 0' 11792 1727096146.34716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.34730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.34741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.34822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.34856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.34874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.34896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.34961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.37057: stdout chunk (state=3): >>>ansible-tmp-1727096146.34071-13028-92320241046319=/root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319 <<< 11792 1727096146.37374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.37378: stdout chunk (state=3): >>><<< 11792 1727096146.37381: stderr chunk (state=3): 
>>><<< 11792 1727096146.37384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096146.34071-13028-92320241046319=/root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.37386: variable 'ansible_module_compression' from source: unknown 11792 1727096146.37388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096146.37395: variable 'ansible_facts' from source: unknown 11792 1727096146.37398: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/AnsiballZ_command.py 11792 1727096146.37521: Sending initial data 11792 1727096146.37531: Sent initial data (153 bytes) 11792 1727096146.38151: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.38172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.38222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096146.38241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.38322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.38356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.38450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.40105: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096146.40213: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096146.40263: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpewkb_rbl /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/AnsiballZ_command.py <<< 11792 1727096146.40278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/AnsiballZ_command.py" <<< 11792 1727096146.40406: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpewkb_rbl" to remote "/root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/AnsiballZ_command.py" <<< 11792 1727096146.41666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.41680: stdout chunk (state=3): >>><<< 11792 1727096146.41690: stderr chunk (state=3): >>><<< 11792 1727096146.41944: done transferring module to remote 11792 1727096146.41948: _low_level_execute_command(): starting 11792 1727096146.41950: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/ /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/AnsiballZ_command.py && sleep 0' 11792 1727096146.43174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.43178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.43181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.43183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096146.43185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096146.43187: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096146.43189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.43191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096146.43354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.43372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.43530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.45363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.45369: stdout chunk (state=3): >>><<< 11792 1727096146.45583: stderr chunk (state=3): >>><<< 11792 1727096146.45603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.45606: _low_level_execute_command(): starting 11792 1727096146.45609: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/AnsiballZ_command.py && sleep 0' 11792 1727096146.46943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.46961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.46974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.47049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.63021: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", 
"rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-23 08:55:46.625396", "end": "2024-09-23 08:55:46.628772", "delta": "0:00:00.003376", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096146.64662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.64715: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. <<< 11792 1727096146.64719: stdout chunk (state=3): >>><<< 11792 1727096146.64725: stderr chunk (state=3): >>><<< 11792 1727096146.64742: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-23 08:55:46.625396", "end": "2024-09-23 08:55:46.628772", "delta": "0:00:00.003376", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096146.64777: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/all_slaves_active', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096146.64780: _low_level_execute_command(): starting 11792 1727096146.64787: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096146.34071-13028-92320241046319/ > /dev/null 2>&1 && sleep 0' 11792 1727096146.65370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.65379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.65391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.65405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096146.65418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096146.65425: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096146.65549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.65553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096146.65555: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096146.65557: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096146.65559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.65561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.65563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096146.65571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.65597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.65666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.67576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.67597: stdout chunk (state=3): >>><<< 11792 1727096146.67611: stderr chunk (state=3): >>><<< 11792 1727096146.67634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.67674: handler run complete 11792 1727096146.67677: Evaluated conditional (False): False 11792 1727096146.67834: variable 'bond_opt' from source: unknown 11792 1727096146.67846: variable 'result' from source: unknown 11792 1727096146.67864: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096146.67896: attempt loop complete, returning result 11792 1727096146.67973: variable 'bond_opt' from source: unknown 11792 1727096146.68001: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'all_slaves_active', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "all_slaves_active", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/all_slaves_active" ], "delta": "0:00:00.003376", "end": "2024-09-23 08:55:46.628772", "rc": 0, "start": "2024-09-23 08:55:46.625396" } STDOUT: 1 11792 1727096146.68256: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096146.68260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096146.68262: variable 'omit' from source: magic vars 11792 1727096146.68594: variable 'ansible_distribution_major_version' from source: facts 11792 1727096146.68598: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096146.68605: variable 'omit' from source: magic vars 11792 1727096146.68608: variable 'omit' from source: magic vars 11792 1727096146.68672: variable 'controller_device' from source: play vars 11792 1727096146.68676: variable 'bond_opt' from source: unknown 11792 1727096146.68679: variable 'omit' from source: magic vars 11792 1727096146.68681: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096146.68687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096146.68704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096146.68730: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096146.68739: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096146.68747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096146.68833: Set connection var ansible_timeout to 10 11792 1727096146.68845: Set 
connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096146.68857: Set connection var ansible_shell_executable to /bin/sh 11792 1727096146.68865: Set connection var ansible_pipelining to False 11792 1727096146.68917: Set connection var ansible_shell_type to sh 11792 1727096146.68919: Set connection var ansible_connection to ssh 11792 1727096146.68921: variable 'ansible_shell_executable' from source: unknown 11792 1727096146.68923: variable 'ansible_connection' from source: unknown 11792 1727096146.68925: variable 'ansible_module_compression' from source: unknown 11792 1727096146.68926: variable 'ansible_shell_type' from source: unknown 11792 1727096146.68928: variable 'ansible_shell_executable' from source: unknown 11792 1727096146.68930: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096146.68931: variable 'ansible_pipelining' from source: unknown 11792 1727096146.68933: variable 'ansible_timeout' from source: unknown 11792 1727096146.68941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096146.69022: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096146.69035: variable 'omit' from source: magic vars 11792 1727096146.69046: starting attempt loop 11792 1727096146.69055: running the handler 11792 1727096146.69130: _low_level_execute_command(): starting 11792 1727096146.69133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096146.69686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.69706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.69784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.69832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.69851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.69875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.69949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.71649: stdout chunk (state=3): >>>/root <<< 11792 1727096146.71786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.71810: stdout chunk (state=3): >>><<< 11792 1727096146.71827: stderr chunk (state=3): >>><<< 11792 1727096146.71933: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.71937: _low_level_execute_command(): starting 11792 1727096146.71939: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211 `" && echo ansible-tmp-1727096146.7185013-13028-114124460929211="` echo /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211 `" ) && sleep 0' 11792 1727096146.72561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.72572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.72638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.74599: stdout chunk (state=3): >>>ansible-tmp-1727096146.7185013-13028-114124460929211=/root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211 <<< 11792 1727096146.74743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.74758: stderr chunk (state=3): >>><<< 11792 1727096146.74770: stdout chunk (state=3): >>><<< 11792 1727096146.74875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096146.7185013-13028-114124460929211=/root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.74878: variable 'ansible_module_compression' from source: unknown 11792 1727096146.74881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096146.74900: variable 'ansible_facts' from source: unknown 11792 1727096146.74979: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/AnsiballZ_command.py 11792 1727096146.75156: Sending initial data 11792 1727096146.75160: Sent initial data (156 bytes) 11792 1727096146.75808: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.75824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.75885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.75956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.75975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.76031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.76076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.77712: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096146.77754: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096146.77790: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096146.77812: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpnnzbz16c /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/AnsiballZ_command.py <<< 11792 1727096146.77845: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/AnsiballZ_command.py" <<< 11792 1727096146.77880: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpnnzbz16c" to remote "/root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/AnsiballZ_command.py" <<< 11792 1727096146.78846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.78849: stdout chunk (state=3): >>><<< 11792 1727096146.78851: stderr chunk (state=3): >>><<< 11792 1727096146.78872: done transferring module to remote 11792 1727096146.78880: _low_level_execute_command(): starting 11792 1727096146.78885: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/ /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/AnsiballZ_command.py && sleep 0' 11792 1727096146.79554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096146.79567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.79608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.79642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.81576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096146.81785: stderr 
chunk (state=3): >>><<< 11792 1727096146.81789: stdout chunk (state=3): >>><<< 11792 1727096146.81809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096146.81817: _low_level_execute_command(): starting 11792 1727096146.81819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/AnsiballZ_command.py && sleep 0' 11792 1727096146.82645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096146.82676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096146.82679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096146.82682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096146.82691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096146.82697: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096146.82705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096146.82835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096146.82881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096146.82885: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096146.82888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096146.82890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096146.99173: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], 
"start": "2024-09-23 08:55:46.986885", "end": "2024-09-23 08:55:46.990104", "delta": "0:00:00.003219", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096147.00998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096147.01002: stdout chunk (state=3): >>><<< 11792 1727096147.01005: stderr chunk (state=3): >>><<< 11792 1727096147.01196: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-23 08:55:46.986885", "end": "2024-09-23 08:55:46.990104", "delta": "0:00:00.003219", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096147.01200: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/downdelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096147.01203: _low_level_execute_command(): starting 11792 1727096147.01205: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096146.7185013-13028-114124460929211/ > /dev/null 2>&1 && sleep 0' 11792 1727096147.02089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.02093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.02105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.02117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.02182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.04271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.04276: stdout chunk (state=3): >>><<< 11792 1727096147.04278: stderr chunk (state=3): >>><<< 11792 1727096147.04511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.04518: handler run complete 11792 1727096147.04521: Evaluated conditional (False): False 11792 1727096147.04742: variable 'bond_opt' from source: unknown 11792 1727096147.04745: variable 'result' from source: unknown 11792 1727096147.04747: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096147.04749: attempt loop complete, returning result 11792 1727096147.04782: variable 'bond_opt' from source: unknown 11792 1727096147.04855: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'downdelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "downdelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/downdelay" ], "delta": "0:00:00.003219", "end": "2024-09-23 08:55:46.990104", "rc": 0, "start": "2024-09-23 08:55:46.986885" } STDOUT: 0 11792 1727096147.05009: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.05013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.05016: variable 'omit' from source: magic vars 11792 1727096147.05488: variable 'ansible_distribution_major_version' from source: facts 11792 1727096147.05491: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096147.05493: variable 'omit' from source: magic vars 11792 1727096147.05495: variable 'omit' from source: magic vars 11792 1727096147.05497: variable 'controller_device' from source: play vars 11792 1727096147.05499: variable 'bond_opt' from source: unknown 11792 1727096147.05501: variable 'omit' from source: magic vars 11792 1727096147.05504: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096147.05506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096147.05512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096147.05532: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096147.05534: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.05537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.05629: Set connection var ansible_timeout to 10 11792 1727096147.05642: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096147.05651: Set connection var ansible_shell_executable to /bin/sh 11792 1727096147.05660: Set connection var ansible_pipelining to False 11792 1727096147.05662: Set connection var ansible_shell_type to sh 11792 1727096147.05664: Set connection var ansible_connection to ssh 11792 1727096147.05693: variable 'ansible_shell_executable' from source: unknown 11792 1727096147.05696: variable 'ansible_connection' from source: unknown 11792 1727096147.05699: variable 'ansible_module_compression' from source: unknown 11792 1727096147.05701: variable 
'ansible_shell_type' from source: unknown 11792 1727096147.05703: variable 'ansible_shell_executable' from source: unknown 11792 1727096147.05706: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.05710: variable 'ansible_pipelining' from source: unknown 11792 1727096147.05713: variable 'ansible_timeout' from source: unknown 11792 1727096147.05717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.05829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096147.05836: variable 'omit' from source: magic vars 11792 1727096147.05839: starting attempt loop 11792 1727096147.05842: running the handler 11792 1727096147.05854: _low_level_execute_command(): starting 11792 1727096147.05861: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096147.06593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.06618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.06655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.06730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.06734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.06780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.06861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.08569: stdout chunk (state=3): >>>/root <<< 11792 1727096147.08720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.08724: stdout chunk (state=3): >>><<< 11792 1727096147.08726: stderr chunk (state=3): >>><<< 11792 1727096147.08748: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.08803: _low_level_execute_command(): starting 11792 1727096147.08806: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469 `" && echo ansible-tmp-1727096147.087564-13028-80329558030469="` echo /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469 `" ) && sleep 0' 11792 1727096147.09582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.09587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.09644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.09697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.11698: stdout chunk (state=3): >>>ansible-tmp-1727096147.087564-13028-80329558030469=/root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469 <<< 11792 1727096147.11990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.11995: stdout chunk (state=3): >>><<< 11792 1727096147.11998: stderr chunk (state=3): >>><<< 11792 1727096147.12000: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096147.087564-13028-80329558030469=/root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.12002: variable 'ansible_module_compression' from source: unknown 11792 1727096147.12004: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096147.12006: variable 'ansible_facts' from source: unknown 11792 1727096147.12008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/AnsiballZ_command.py 11792 1727096147.12221: Sending initial data 11792 1727096147.12224: Sent initial data (154 bytes) 11792 1727096147.12864: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.12870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.12872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.12874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.12877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.12882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.12947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.14623: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096147.14657: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096147.14685: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096147.14763: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp1by_b_5p /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/AnsiballZ_command.py <<< 11792 1727096147.14767: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/AnsiballZ_command.py" <<< 11792 1727096147.14799: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp1by_b_5p" to remote "/root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/AnsiballZ_command.py" <<< 11792 1727096147.15660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.15664: stderr chunk (state=3): >>><<< 11792 1727096147.15667: stdout chunk (state=3): >>><<< 11792 1727096147.15678: done transferring module to remote 11792 1727096147.15689: _low_level_execute_command(): starting 11792 1727096147.15696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/ /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/AnsiballZ_command.py && sleep 0' 11792 1727096147.16343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.16363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.16381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.16424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096147.16437: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096147.16449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.16535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.16557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.16576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.16600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.16664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.18657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.18661: stdout chunk (state=3): >>><<< 11792 1727096147.18664: stderr chunk (state=3): >>><<< 11792 1727096147.18770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.18776: _low_level_execute_command(): starting 11792 1727096147.18779: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/AnsiballZ_command.py && sleep 0' 11792 1727096147.19320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.19323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.19326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.19329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.19341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096147.19430: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.19433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.19448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.19487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.19539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.35778: stdout chunk (state=3): >>> {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-23 08:55:47.351548", "end": "2024-09-23 08:55:47.354826", "delta": "0:00:00.003278", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096147.37300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096147.37304: stdout chunk (state=3): >>><<< 11792 1727096147.37312: stderr chunk (state=3): >>><<< 11792 1727096147.37334: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-23 08:55:47.351548", "end": "2024-09-23 08:55:47.354826", "delta": "0:00:00.003278", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096147.37363: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lacp_rate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096147.37367: _low_level_execute_command(): starting 11792 1727096147.37376: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096147.087564-13028-80329558030469/ > /dev/null 2>&1 && sleep 0' 11792 1727096147.38080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.38272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.38275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.38278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.38285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.38288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.38300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.38401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.40330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.40341: stdout chunk (state=3): >>><<< 11792 1727096147.40356: stderr chunk (state=3): >>><<< 11792 1727096147.40389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.40401: handler run complete 11792 1727096147.40429: Evaluated conditional (False): False 11792 1727096147.40612: variable 'bond_opt' from source: unknown 11792 1727096147.40624: variable 'result' from source: unknown 11792 1727096147.40643: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096147.40666: attempt loop complete, returning result 11792 1727096147.40695: variable 'bond_opt' from source: unknown 11792 1727096147.40778: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'lacp_rate', 'value': 'slow'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lacp_rate", "value": "slow" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lacp_rate" ], "delta": "0:00:00.003278", "end": "2024-09-23 08:55:47.354826", "rc": 0, "start": "2024-09-23 08:55:47.351548" } STDOUT: slow 0 11792 1727096147.41049: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.41055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.41057: variable 'omit' from source: magic vars 11792 1727096147.41210: variable 'ansible_distribution_major_version' from source: facts 11792 1727096147.41222: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096147.41231: variable 'omit' from source: magic vars 11792 1727096147.41250: variable 'omit' from source: magic vars 11792 1727096147.41433: variable 'controller_device' from source: play vars 11792 1727096147.41444: variable 'bond_opt' from source: unknown 11792 1727096147.41488: variable 'omit' from source: magic vars 11792 1727096147.41508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096147.41573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096147.41576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096147.41578: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096147.41581: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.41583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.41654: Set connection var ansible_timeout to 10 11792 1727096147.41672: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096147.41687: Set connection var ansible_shell_executable to /bin/sh 11792 1727096147.41706: Set connection var ansible_pipelining to False 11792 1727096147.41715: Set connection var ansible_shell_type to sh 11792 1727096147.41722: Set connection var ansible_connection to ssh 11792 1727096147.41748: variable 'ansible_shell_executable' from source: unknown 11792 1727096147.41815: variable 
'ansible_connection' from source: unknown 11792 1727096147.41818: variable 'ansible_module_compression' from source: unknown 11792 1727096147.41821: variable 'ansible_shell_type' from source: unknown 11792 1727096147.41823: variable 'ansible_shell_executable' from source: unknown 11792 1727096147.41825: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.41827: variable 'ansible_pipelining' from source: unknown 11792 1727096147.41830: variable 'ansible_timeout' from source: unknown 11792 1727096147.41831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.41902: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096147.41961: variable 'omit' from source: magic vars 11792 1727096147.42004: starting attempt loop 11792 1727096147.42120: running the handler 11792 1727096147.42122: _low_level_execute_command(): starting 11792 1727096147.42125: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096147.42971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.42988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.43004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.43034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.43141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.43165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.43250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.44930: stdout chunk (state=3): >>>/root <<< 11792 1727096147.45021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.45090: stderr chunk (state=3): >>><<< 11792 1727096147.45109: stdout chunk (state=3): >>><<< 11792 1727096147.45133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.45154: _low_level_execute_command(): starting 11792 1727096147.45165: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497 `" && echo ansible-tmp-1727096147.451423-13028-251888909737497="` echo /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497 `" ) && sleep 0' 11792 1727096147.45823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.45840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.45857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.45886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.45908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096147.45931: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.46037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.46059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.46149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.48118: stdout chunk (state=3): >>>ansible-tmp-1727096147.451423-13028-251888909737497=/root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497 <<< 11792 1727096147.48259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.48284: stdout chunk (state=3): >>><<< 11792 1727096147.48294: stderr chunk (state=3): >>><<< 11792 1727096147.48313: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096147.451423-13028-251888909737497=/root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.48473: variable 'ansible_module_compression' from source: unknown 11792 1727096147.48476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096147.48478: variable 'ansible_facts' from source: unknown 11792 1727096147.48480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/AnsiballZ_command.py 11792 1727096147.48615: Sending initial data 11792 1727096147.48624: Sent initial data (155 bytes) 11792 1727096147.49306: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.49320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.49333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.49399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.49673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.49736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.49774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.49809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.51461: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096147.51528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096147.51587: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp76uy0np9 /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/AnsiballZ_command.py <<< 11792 1727096147.51605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/AnsiballZ_command.py" <<< 11792 1727096147.51650: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp76uy0np9" to remote "/root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/AnsiballZ_command.py" <<< 11792 1727096147.52433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.52484: stderr chunk (state=3): >>><<< 11792 1727096147.52499: stdout chunk (state=3): >>><<< 11792 1727096147.52635: done transferring module to remote 11792 1727096147.52638: _low_level_execute_command(): starting 11792 1727096147.52641: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/ /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/AnsiballZ_command.py && sleep 0' 11792 1727096147.53270: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.53293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.53405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.53451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.53475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.55537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.55541: stdout chunk (state=3): >>><<< 11792 1727096147.55544: stderr chunk (state=3): >>><<< 11792 
1727096147.55547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.55549: _low_level_execute_command(): starting 11792 1727096147.55554: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/AnsiballZ_command.py && sleep 0' 11792 1727096147.56169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.56188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.56205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.56229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.56247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096147.56343: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.56362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.56382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.56405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.56495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.72678: stdout chunk (state=3): >>> {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-23 08:55:47.722126", "end": "2024-09-23 08:55:47.725344", "delta": "0:00:00.003218", "msg": "", "invocation": {"module_args": {"_raw_params": "cat 
/sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096147.74659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096147.74663: stdout chunk (state=3): >>><<< 11792 1727096147.74665: stderr chunk (state=3): >>><<< 11792 1727096147.74692: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-23 08:55:47.722126", "end": "2024-09-23 08:55:47.725344", "delta": "0:00:00.003218", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096147.74874: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096147.74881: _low_level_execute_command(): starting 11792 1727096147.74884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096147.451423-13028-251888909737497/ > /dev/null 2>&1 && sleep 0' 11792 1727096147.76014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.76018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.76021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.76023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.76227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.76328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.78208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.78241: stderr chunk (state=3): >>><<< 11792 1727096147.78250: stdout chunk (state=3): >>><<< 11792 1727096147.78295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.78674: handler run complete 11792 1727096147.78677: Evaluated conditional (False): False 11792 1727096147.78680: variable 'bond_opt' from source: unknown 11792 1727096147.78682: variable 'result' from source: unknown 11792 1727096147.78774: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096147.78795: attempt loop complete, returning result 11792 1727096147.78818: variable 'bond_opt' from source: unknown 11792 1727096147.78996: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'lp_interval', 'value': '128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lp_interval", "value": "128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lp_interval" ], "delta": "0:00:00.003218", "end": "2024-09-23 08:55:47.725344", "rc": 0, "start": "2024-09-23 08:55:47.722126" } STDOUT: 128 11792 1727096147.79585: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.79588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.79590: variable 'omit' from source: magic vars 11792 1727096147.79710: variable 'ansible_distribution_major_version' from source: facts 11792 1727096147.79814: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096147.79910: variable 'omit' from source: magic vars 11792 1727096147.79913: variable 'omit' from source: magic vars 11792 1727096147.80181: variable 'controller_device' from source: play vars 11792 1727096147.80190: variable 'bond_opt' from source: unknown 11792 1727096147.80213: variable 'omit' from source: magic vars 11792 1727096147.80275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096147.80291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096147.80301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096147.80317: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096147.80351: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.80363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.80458: Set connection var ansible_timeout to 10 11792 1727096147.80685: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096147.80688: Set connection var ansible_shell_executable to /bin/sh 11792 1727096147.80691: Set connection var ansible_pipelining to False 11792 1727096147.80693: Set connection var ansible_shell_type to sh 11792 1727096147.80695: Set connection var ansible_connection to ssh 11792 1727096147.80696: variable 'ansible_shell_executable' from source: unknown 11792 1727096147.80698: variable 'ansible_connection' from source: unknown 11792 1727096147.80700: variable 'ansible_module_compression' from 
source: unknown 11792 1727096147.80702: variable 'ansible_shell_type' from source: unknown 11792 1727096147.80704: variable 'ansible_shell_executable' from source: unknown 11792 1727096147.80706: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096147.80708: variable 'ansible_pipelining' from source: unknown 11792 1727096147.80710: variable 'ansible_timeout' from source: unknown 11792 1727096147.80712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096147.80855: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096147.81104: variable 'omit' from source: magic vars 11792 1727096147.81111: starting attempt loop 11792 1727096147.81113: running the handler 11792 1727096147.81115: _low_level_execute_command(): starting 11792 1727096147.81117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096147.82306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.82449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096147.82455: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.82462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096147.82465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.82505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.82518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.82582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.84291: stdout chunk (state=3): >>>/root <<< 11792 1727096147.84501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.84534: stderr chunk (state=3): >>><<< 11792 1727096147.84543: stdout chunk (state=3): >>><<< 11792 1727096147.84765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.84771: _low_level_execute_command(): starting 11792 1727096147.84773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163 `" && echo ansible-tmp-1727096147.8467546-13028-46994395295163="` echo /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163 `" ) && sleep 0' 11792 1727096147.85885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.85889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.86022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096147.86093: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.86097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.86255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.88234: stdout chunk (state=3): >>>ansible-tmp-1727096147.8467546-13028-46994395295163=/root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163 <<< 11792 1727096147.88334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.88380: stderr chunk (state=3): >>><<< 11792 1727096147.88383: stdout chunk (state=3): >>><<< 11792 1727096147.88430: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096147.8467546-13028-46994395295163=/root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.88434: variable 'ansible_module_compression' from source: unknown 11792 1727096147.88476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096147.88502: variable 'ansible_facts' from source: unknown 11792 1727096147.88691: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/AnsiballZ_command.py 11792 1727096147.88939: Sending initial data 11792 1727096147.88948: Sent initial data (155 bytes) 11792 1727096147.90061: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.90283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.90398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.90416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.90481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.92129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096147.92251: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096147.92274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp907yse9r /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/AnsiballZ_command.py <<< 11792 1727096147.92285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp907yse9r" to remote "/root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/AnsiballZ_command.py" <<< 11792 1727096147.93988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.94159: stderr chunk (state=3): >>><<< 11792 1727096147.94162: stdout chunk (state=3): >>><<< 11792 1727096147.94188: done transferring module to remote 11792 1727096147.94198: _low_level_execute_command(): starting 11792 1727096147.94201: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/ /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/AnsiballZ_command.py && sleep 0' 11792 1727096147.95458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.95462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096147.95550: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096147.95559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096147.95565: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.95570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.95572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096147.95574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.95638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.95783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096147.97554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096147.97664: stderr chunk (state=3): >>><<< 11792 1727096147.97669: stdout chunk (state=3): >>><<< 11792 1727096147.97688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096147.97692: _low_level_execute_command(): starting 11792 1727096147.97697: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/AnsiballZ_command.py && sleep 0' 11792 1727096147.98885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096147.98940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096147.98956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096147.98976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096147.98992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096147.99004: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096147.99018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096147.99155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096147.99585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096147.99761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.16034: stdout chunk (state=3): >>> {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-23 08:55:48.155527", "end": "2024-09-23 08:55:48.159017", "delta": "0:00:00.003490", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096148.17844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096148.17848: stdout chunk (state=3): >>><<< 11792 1727096148.17851: stderr chunk (state=3): >>><<< 11792 1727096148.17883: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-23 08:55:48.155527", "end": "2024-09-23 08:55:48.159017", "delta": "0:00:00.003490", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096148.17908: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/miimon', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096148.17911: _low_level_execute_command(): starting 11792 1727096148.17917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096147.8467546-13028-46994395295163/ > /dev/null 2>&1 && sleep 0' 11792 1727096148.18572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096148.18575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.18577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.18592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096148.18605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096148.18612: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096148.18622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.18644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096148.18654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096148.18657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096148.18671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.18676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.18743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.18783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.18786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.18796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.18871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.20978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.20982: stdout chunk (state=3): >>><<< 11792 1727096148.20985: stderr chunk (state=3): >>><<< 11792 1727096148.20987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.20989: handler run complete 11792 1727096148.20992: Evaluated conditional (False): False 11792 1727096148.21114: variable 'bond_opt' from source: unknown 11792 1727096148.21124: variable 'result' from source: unknown 11792 1727096148.21138: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096148.21150: attempt loop complete, returning result 11792 1727096148.21178: variable 'bond_opt' from source: unknown 11792 1727096148.21251: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'miimon', 'value': '110'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "miimon", "value": "110" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/miimon" ], "delta": "0:00:00.003490", "end": "2024-09-23 08:55:48.159017", "rc": 0, "start": "2024-09-23 08:55:48.155527" } STDOUT: 110 11792 1727096148.21396: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.21399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.21402: variable 'omit' from source: magic vars 11792 1727096148.21880: variable 'ansible_distribution_major_version' from source: facts 11792 1727096148.21885: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096148.21887: variable 'omit' from source: magic vars 11792 1727096148.21890: variable 'omit' from source: magic vars 11792 1727096148.21892: variable 'controller_device' from source: play vars 11792 1727096148.21894: variable 'bond_opt' from source: unknown 11792 1727096148.21896: variable 'omit' from source: magic vars 11792 1727096148.21898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096148.21901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096148.21902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096148.21904: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096148.21907: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.21908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.22086: Set connection var ansible_timeout to 10 11792 
1727096148.22092: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096148.22100: Set connection var ansible_shell_executable to /bin/sh 11792 1727096148.22105: Set connection var ansible_pipelining to False 11792 1727096148.22108: Set connection var ansible_shell_type to sh 11792 1727096148.22110: Set connection var ansible_connection to ssh 11792 1727096148.22129: variable 'ansible_shell_executable' from source: unknown 11792 1727096148.22132: variable 'ansible_connection' from source: unknown 11792 1727096148.22134: variable 'ansible_module_compression' from source: unknown 11792 1727096148.22137: variable 'ansible_shell_type' from source: unknown 11792 1727096148.22139: variable 'ansible_shell_executable' from source: unknown 11792 1727096148.22141: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.22147: variable 'ansible_pipelining' from source: unknown 11792 1727096148.22154: variable 'ansible_timeout' from source: unknown 11792 1727096148.22157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.22247: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096148.22257: variable 'omit' from source: magic vars 11792 1727096148.22260: starting attempt loop 11792 1727096148.22263: running the handler 11792 1727096148.22279: _low_level_execute_command(): starting 11792 1727096148.22284: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096148.22944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.22985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.23047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.23163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.23202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.23253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.24993: stdout chunk (state=3): >>>/root <<< 11792 1727096148.25131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.25149: stderr chunk (state=3): >>><<< 11792 1727096148.25165: stdout chunk (state=3): >>><<< 11792 1727096148.25181: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.25274: _low_level_execute_command(): starting 11792 1727096148.25278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802 `" && echo ansible-tmp-1727096148.2518616-13028-16962003861802="` echo /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802 `" ) && sleep 0' 11792 1727096148.25887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.25908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.25944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.25947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.26002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.28087: stdout chunk (state=3): >>>ansible-tmp-1727096148.2518616-13028-16962003861802=/root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802 <<< 11792 1727096148.28257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.28261: stdout chunk (state=3): >>><<< 11792 1727096148.28264: stderr chunk (state=3): >>><<< 11792 1727096148.28285: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096148.2518616-13028-16962003861802=/root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.28313: variable 'ansible_module_compression' from source: unknown 11792 1727096148.28470: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096148.28473: variable 'ansible_facts' from source: unknown 11792 1727096148.28476: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/AnsiballZ_command.py 11792 1727096148.28612: Sending initial data 11792 1727096148.28622: Sent initial data (155 bytes) 11792 1727096148.29388: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.29425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.29442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.29464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.29542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.31216: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096148.31273: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096148.31473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/AnsiballZ_command.py" <<< 11792 1727096148.31477: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp698sc7d_ /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/AnsiballZ_command.py <<< 11792 1727096148.31479: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp698sc7d_" to remote "/root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/AnsiballZ_command.py" <<< 11792 1727096148.32981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.32985: stdout chunk (state=3): >>><<< 11792 1727096148.32990: stderr chunk (state=3): >>><<< 11792 1727096148.33059: done transferring module to remote 11792 1727096148.33183: _low_level_execute_command(): starting 11792 1727096148.33190: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/ /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/AnsiballZ_command.py && sleep 0' 11792 1727096148.34807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.35018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.35045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.35124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.37094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.37187: stdout chunk (state=3): >>><<< 11792 
1727096148.37204: stderr chunk (state=3): >>><<< 11792 1727096148.37229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.37239: _low_level_execute_command(): starting 11792 1727096148.37250: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/AnsiballZ_command.py && sleep 0' 11792 1727096148.38431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096148.38549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.38573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.38876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.38907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.38987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.55625: stdout chunk (state=3): >>> {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-23 08:55:48.551775", "end": "2024-09-23 08:55:48.555018", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": 
null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096148.57449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096148.57453: stdout chunk (state=3): >>><<< 11792 1727096148.57461: stderr chunk (state=3): >>><<< 11792 1727096148.57574: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-23 08:55:48.551775", "end": "2024-09-23 08:55:48.555018", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096148.57578: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/num_grat_arp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096148.57581: _low_level_execute_command(): starting 11792 1727096148.57583: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096148.2518616-13028-16962003861802/ > /dev/null 2>&1 && sleep 0' 11792 1727096148.58140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096148.58144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.58165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096148.58171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.58182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.58234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.58243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.58260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.58289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.60206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.60229: stderr chunk (state=3): >>><<< 11792 1727096148.60232: stdout chunk (state=3): >>><<< 11792 1727096148.60247: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.60252: handler run complete 11792 1727096148.60276: Evaluated conditional (False): False 11792 1727096148.60386: variable 'bond_opt' from source: unknown 11792 1727096148.60389: variable 'result' from source: unknown 11792 1727096148.60400: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096148.60409: attempt loop complete, returning result 11792 1727096148.60423: variable 'bond_opt' from source: unknown 11792 1727096148.60477: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'num_grat_arp', 'value': '64'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "num_grat_arp", "value": "64" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/num_grat_arp" ], "delta": "0:00:00.003243", "end": "2024-09-23 08:55:48.555018", "rc": 0, "start": "2024-09-23 08:55:48.551775" } STDOUT: 64 11792 1727096148.60600: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.60603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.60606: variable 'omit' from source: magic vars 11792 1727096148.60697: variable 'ansible_distribution_major_version' from source: facts 11792 1727096148.60700: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096148.60705: variable 'omit' from source: magic vars 11792 1727096148.60722: variable 'omit' from source: magic vars 11792 1727096148.60827: variable 'controller_device' from source: play vars 11792 1727096148.60830: variable 'bond_opt' from source: unknown 11792 1727096148.60845: variable 'omit' from source: magic vars 11792 1727096148.60864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096148.60873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096148.60881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096148.60890: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096148.60893: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.60895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.60948: Set connection var ansible_timeout to 10 11792 1727096148.60951: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096148.60959: Set connection var ansible_shell_executable to /bin/sh 11792 1727096148.60964: Set connection var ansible_pipelining to False 11792 1727096148.60966: Set connection var ansible_shell_type to sh 11792 1727096148.60970: Set connection var ansible_connection to ssh 11792 1727096148.60985: variable 
'ansible_shell_executable' from source: unknown 11792 1727096148.60988: variable 'ansible_connection' from source: unknown 11792 1727096148.60990: variable 'ansible_module_compression' from source: unknown 11792 1727096148.60992: variable 'ansible_shell_type' from source: unknown 11792 1727096148.60994: variable 'ansible_shell_executable' from source: unknown 11792 1727096148.60996: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.61001: variable 'ansible_pipelining' from source: unknown 11792 1727096148.61003: variable 'ansible_timeout' from source: unknown 11792 1727096148.61008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.61077: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096148.61084: variable 'omit' from source: magic vars 11792 1727096148.61087: starting attempt loop 11792 1727096148.61089: running the handler 11792 1727096148.61096: _low_level_execute_command(): starting 11792 1727096148.61099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096148.61565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.61572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096148.61574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.61576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.61578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096148.61580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.61623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.61634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.61688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.63403: stdout chunk (state=3): >>>/root <<< 11792 1727096148.63495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.63530: stderr chunk (state=3): >>><<< 11792 1727096148.63535: stdout chunk (state=3): >>><<< 11792 1727096148.63553: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.63563: _low_level_execute_command(): starting 11792 1727096148.63570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454 `" && echo ansible-tmp-1727096148.6355467-13028-1836841351454="` echo /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454 `" ) && sleep 0' 11792 1727096148.64020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.64024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096148.64026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.64028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.64030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.64086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.64089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.64100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.64130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.66146: stdout chunk (state=3): >>>ansible-tmp-1727096148.6355467-13028-1836841351454=/root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454 <<< 11792 1727096148.66255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.66285: stderr chunk (state=3): >>><<< 11792 1727096148.66289: stdout chunk (state=3): >>><<< 11792 1727096148.66304: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096148.6355467-13028-1836841351454=/root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.66323: variable 'ansible_module_compression' from source: unknown 11792 1727096148.66356: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096148.66373: variable 'ansible_facts' from source: unknown 11792 1727096148.66414: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/AnsiballZ_command.py 11792 1727096148.66506: Sending initial data 11792 1727096148.66509: Sent initial data (154 bytes) 11792 1727096148.66938: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.66942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096148.66978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096148.66981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.66983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.66985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096148.66987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.67057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.67060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.67094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 
1727096148.68758: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096148.68790: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096148.68821: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp2r_6ekl2 /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/AnsiballZ_command.py <<< 11792 1727096148.68829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/AnsiballZ_command.py" <<< 11792 1727096148.68861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp2r_6ekl2" to remote "/root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/AnsiballZ_command.py" <<< 11792 1727096148.68863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/AnsiballZ_command.py" <<< 11792 1727096148.69377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.69423: stderr chunk (state=3): >>><<< 11792 1727096148.69427: stdout chunk (state=3): >>><<< 11792 1727096148.69475: done transferring module to remote 11792 1727096148.69482: _low_level_execute_command(): starting 11792 1727096148.69487: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/ /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/AnsiballZ_command.py && sleep 0' 11792 1727096148.69926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.69930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096148.69962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.69965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.69969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11792 1727096148.70021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.70024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.70027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.70072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.71965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.71993: stderr chunk (state=3): >>><<< 11792 1727096148.71996: stdout chunk (state=3): >>><<< 11792 1727096148.72010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.72013: _low_level_execute_command(): starting 11792 1727096148.72018: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/AnsiballZ_command.py && sleep 0' 11792 1727096148.72446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.72459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096148.72482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.72485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.72533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.72537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.72550: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.72601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.88824: stdout chunk (state=3): >>> {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-23 08:55:48.883567", "end": "2024-09-23 08:55:48.886952", "delta": "0:00:00.003385", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096148.90512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096148.90538: stderr chunk (state=3): >>><<< 11792 1727096148.90542: stdout chunk (state=3): >>><<< 11792 1727096148.90558: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-23 08:55:48.883567", "end": "2024-09-23 08:55:48.886952", "delta": "0:00:00.003385", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
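The items exercised so far in this run check miimon=110, num_grat_arp=64 and resend_igmp=225 on the controller device, which the log indicates is nm-bond. If the loop is driven by a dictionary as assumed above, the play vars behind it would have roughly the following shape; only controller_device and the three option values are visible in the log, the variable name and layout are assumptions.

# Assumed shape of the play vars feeding the loop; dict2items turns each
# entry into the {'key': ..., 'value': ...} items seen in the ok: lines.
controller_device: nm-bond
bond_options_to_assert:
  miimon: "110"
  num_grat_arp: "64"
  resend_igmp: "225"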
11792 1727096148.90582: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/resend_igmp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096148.90587: _low_level_execute_command(): starting 11792 1727096148.90589: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096148.6355467-13028-1836841351454/ > /dev/null 2>&1 && sleep 0' 11792 1727096148.91053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.91057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.91059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.91061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.91113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.91117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.91119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.91162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.93064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.93071: stdout chunk (state=3): >>><<< 11792 1727096148.93074: stderr chunk (state=3): >>><<< 11792 1727096148.93092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.93100: handler run complete 11792 1727096148.93113: Evaluated conditional (False): False 11792 1727096148.93224: variable 'bond_opt' from source: unknown 11792 1727096148.93228: variable 'result' from source: unknown 11792 1727096148.93239: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096148.93247: attempt loop complete, returning result 11792 1727096148.93264: variable 'bond_opt' from source: unknown 11792 1727096148.93315: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'resend_igmp', 'value': '225'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "resend_igmp", "value": "225" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/resend_igmp" ], "delta": "0:00:00.003385", "end": "2024-09-23 08:55:48.886952", "rc": 0, "start": "2024-09-23 08:55:48.883567" } STDOUT: 225 11792 1727096148.93440: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.93443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.93446: variable 'omit' from source: magic vars 11792 1727096148.93556: variable 'ansible_distribution_major_version' from source: facts 11792 1727096148.93564: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096148.93569: variable 'omit' from source: magic vars 11792 1727096148.93580: variable 'omit' from source: magic vars 11792 1727096148.93688: variable 'controller_device' from source: play vars 11792 1727096148.93691: variable 'bond_opt' from source: unknown 11792 1727096148.93705: variable 'omit' from source: magic vars 11792 1727096148.93721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096148.93727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096148.93733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096148.93743: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096148.93746: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.93748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.93804: Set connection var ansible_timeout to 10 11792 1727096148.93809: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096148.93817: Set connection var ansible_shell_executable to /bin/sh 11792 1727096148.93821: Set connection var ansible_pipelining to False 11792 1727096148.93824: Set connection var ansible_shell_type to sh 11792 1727096148.93826: Set connection var ansible_connection to ssh 11792 1727096148.93841: variable 'ansible_shell_executable' from source: unknown 11792 1727096148.93844: variable 
'ansible_connection' from source: unknown 11792 1727096148.93846: variable 'ansible_module_compression' from source: unknown 11792 1727096148.93848: variable 'ansible_shell_type' from source: unknown 11792 1727096148.93851: variable 'ansible_shell_executable' from source: unknown 11792 1727096148.93853: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096148.93859: variable 'ansible_pipelining' from source: unknown 11792 1727096148.93862: variable 'ansible_timeout' from source: unknown 11792 1727096148.93865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096148.93949: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096148.93959: variable 'omit' from source: magic vars 11792 1727096148.93962: starting attempt loop 11792 1727096148.93964: running the handler 11792 1727096148.93972: _low_level_execute_command(): starting 11792 1727096148.93975: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096148.94407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.94411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.94423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.94473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.94485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.94531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.96203: stdout chunk (state=3): >>>/root <<< 11792 1727096148.96302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.96331: stderr chunk (state=3): >>><<< 11792 1727096148.96334: stdout chunk (state=3): >>><<< 11792 1727096148.96348: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.96361: _low_level_execute_command(): starting 11792 1727096148.96364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369 `" && echo ansible-tmp-1727096148.9634988-13028-134438468178369="` echo /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369 `" ) && sleep 0' 11792 1727096148.96824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.96827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.96834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096148.96837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096148.96839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.96873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.96888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.96932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096148.98881: stdout chunk (state=3): >>>ansible-tmp-1727096148.9634988-13028-134438468178369=/root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369 <<< 11792 1727096148.98991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096148.99021: stderr chunk (state=3): >>><<< 11792 1727096148.99030: stdout chunk (state=3): >>><<< 11792 1727096148.99047: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096148.9634988-13028-134438468178369=/root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096148.99071: variable 'ansible_module_compression' from source: unknown 11792 1727096148.99101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096148.99116: variable 'ansible_facts' from source: unknown 11792 1727096148.99156: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/AnsiballZ_command.py 11792 1727096148.99251: Sending initial data 11792 1727096148.99258: Sent initial data (156 bytes) 11792 1727096148.99742: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.99747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096148.99750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096148.99752: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096148.99754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096148.99808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096148.99813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096148.99815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096148.99849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.01526: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096149.01582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096149.01682: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpv3jojs_m /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/AnsiballZ_command.py <<< 11792 1727096149.01687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/AnsiballZ_command.py" <<< 11792 1727096149.01760: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpv3jojs_m" to remote "/root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/AnsiballZ_command.py" <<< 11792 1727096149.02415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.02463: stderr chunk (state=3): >>><<< 11792 1727096149.02467: stdout chunk (state=3): >>><<< 11792 1727096149.02492: done transferring module to remote 11792 1727096149.02500: _low_level_execute_command(): starting 11792 1727096149.02505: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/ /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/AnsiballZ_command.py && sleep 0' 11792 1727096149.02930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096149.02970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096149.02974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096149.02977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.02979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096149.02985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096149.02987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.03028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.03032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.03072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 11792 1727096149.05084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.05090: stdout chunk (state=3): >>><<< 11792 1727096149.05092: stderr chunk (state=3): >>><<< 11792 1727096149.05095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.05097: _low_level_execute_command(): starting 11792 1727096149.05099: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/AnsiballZ_command.py && sleep 0' 11792 1727096149.05624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096149.05632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096149.05644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096149.05661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096149.05763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096149.05767: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096149.05772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.05774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096149.05776: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096149.05778: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096149.05780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096149.05781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096149.05783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096149.05785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096149.05787: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096149.05789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.05839: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.05852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.05877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.05961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.22309: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-23 08:55:49.217736", "end": "2024-09-23 08:55:49.221013", "delta": "0:00:00.003277", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096149.23996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096149.24275: stderr chunk (state=3): >>><<< 11792 1727096149.24279: stdout chunk (state=3): >>><<< 11792 1727096149.24282: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-23 08:55:49.217736", "end": "2024-09-23 08:55:49.221013", "delta": "0:00:00.003277", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
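For reference, a minimal local sketch (Python) of the per-option check this task is performing: it runs the same cat command the module just executed against /sys/class/net/nm-bond/bonding/updelay and applies the same substring test the play evaluates as "bond_opt.value in result.stdout". The bond name and the updelay/0 pair come from the log above; the dict, loop and function names are only illustrative, not the play's actual YAML.

import subprocess

BOND = "nm-bond"              # bond device name seen in the commands above
EXPECTED = {"updelay": "0"}   # pair verified just above; further options (use_carrier, ...) follow the same pattern

for key, value in EXPECTED.items():
    # same command the remote command module ran: cat /sys/class/net/nm-bond/bonding/<key>
    result = subprocess.run(
        ["cat", f"/sys/class/net/{BOND}/bonding/{key}"],
        capture_output=True, text=True, check=True,
    )
    # mirrors the task conditional logged as: Evaluated conditional (bond_opt.value in result.stdout)
    assert value in result.stdout, f"{key}: expected {value!r}, got {result.stdout!r}"
    print(f"ok: {key}={result.stdout.strip()}")

Run on the managed node itself this prints "ok: updelay=0", matching the STDOUT captured in the task result above.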
11792 1727096149.24285: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/updelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096149.24292: _low_level_execute_command(): starting 11792 1727096149.24294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096148.9634988-13028-134438468178369/ > /dev/null 2>&1 && sleep 0' 11792 1727096149.25517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096149.25532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.25800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.25930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.25961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.27875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.27905: stderr chunk (state=3): >>><<< 11792 1727096149.27916: stdout chunk (state=3): >>><<< 11792 1727096149.28141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 
originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.28145: handler run complete 11792 1727096149.28147: Evaluated conditional (False): False 11792 1727096149.28348: variable 'bond_opt' from source: unknown 11792 1727096149.28571: variable 'result' from source: unknown 11792 1727096149.28574: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096149.28577: attempt loop complete, returning result 11792 1727096149.28579: variable 'bond_opt' from source: unknown 11792 1727096149.28691: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'updelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "updelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/updelay" ], "delta": "0:00:00.003277", "end": "2024-09-23 08:55:49.221013", "rc": 0, "start": "2024-09-23 08:55:49.217736" } STDOUT: 0 11792 1727096149.29073: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096149.29076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096149.29079: variable 'omit' from source: magic vars 11792 1727096149.29491: variable 'ansible_distribution_major_version' from source: facts 11792 1727096149.29495: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096149.29497: variable 'omit' from source: magic vars 11792 1727096149.29499: variable 'omit' from source: magic vars 11792 1727096149.29670: variable 'controller_device' from source: play vars 11792 1727096149.29928: variable 'bond_opt' from source: unknown 11792 1727096149.29931: variable 'omit' from source: magic vars 11792 1727096149.29934: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096149.29936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096149.29938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096149.29940: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096149.29942: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096149.29944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096149.30029: Set connection var ansible_timeout to 10 11792 1727096149.30259: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096149.30262: Set connection var ansible_shell_executable to /bin/sh 11792 1727096149.30264: Set connection var ansible_pipelining to False 11792 1727096149.30266: Set connection var ansible_shell_type to sh 11792 1727096149.30271: Set connection var ansible_connection to ssh 11792 1727096149.30273: variable 'ansible_shell_executable' from source: unknown 11792 1727096149.30275: variable 'ansible_connection' from source: unknown 11792 1727096149.30276: variable 'ansible_module_compression' from source: unknown 11792 1727096149.30278: variable 'ansible_shell_type' from 
source: unknown 11792 1727096149.30280: variable 'ansible_shell_executable' from source: unknown 11792 1727096149.30282: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096149.30284: variable 'ansible_pipelining' from source: unknown 11792 1727096149.30286: variable 'ansible_timeout' from source: unknown 11792 1727096149.30288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096149.30432: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096149.30680: variable 'omit' from source: magic vars 11792 1727096149.30684: starting attempt loop 11792 1727096149.30689: running the handler 11792 1727096149.30694: _low_level_execute_command(): starting 11792 1727096149.30697: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096149.31690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096149.31790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.31896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.31920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.32011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.33671: stdout chunk (state=3): >>>/root <<< 11792 1727096149.33754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.34074: stderr chunk (state=3): >>><<< 11792 1727096149.34077: stdout chunk (state=3): >>><<< 11792 1727096149.34080: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.34082: _low_level_execute_command(): starting 11792 1727096149.34084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768 `" && echo ansible-tmp-1727096149.3399723-13028-57175195671768="` echo /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768 `" ) && sleep 0' 11792 1727096149.35281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096149.35295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.35348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.35495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.35551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.37593: stdout chunk (state=3): >>>ansible-tmp-1727096149.3399723-13028-57175195671768=/root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768 <<< 11792 1727096149.37690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.37724: stderr chunk (state=3): >>><<< 11792 1727096149.37732: stdout chunk (state=3): >>><<< 11792 1727096149.37806: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096149.3399723-13028-57175195671768=/root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.37837: variable 'ansible_module_compression' from source: unknown 11792 1727096149.38075: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096149.38078: variable 'ansible_facts' from source: unknown 11792 1727096149.38216: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/AnsiballZ_command.py 11792 1727096149.38655: Sending initial data 11792 1727096149.38666: Sent initial data (155 bytes) 11792 1727096149.39902: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.40035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.40137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.40215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.42156: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096149.42258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpd3wuq1yb /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/AnsiballZ_command.py <<< 11792 1727096149.42262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/AnsiballZ_command.py" <<< 11792 1727096149.42537: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpd3wuq1yb" to remote "/root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/AnsiballZ_command.py" <<< 11792 1727096149.44480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.44493: stdout chunk (state=3): >>><<< 11792 1727096149.44511: stderr chunk (state=3): >>><<< 11792 1727096149.44717: done transferring module to remote 11792 1727096149.44724: _low_level_execute_command(): starting 11792 1727096149.44726: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/ /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/AnsiballZ_command.py && sleep 0' 11792 1727096149.45977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.46109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.46374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.48192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.48222: stderr chunk (state=3): >>><<< 11792 1727096149.48225: stdout chunk (state=3): >>><<< 11792 1727096149.48248: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.48251: _low_level_execute_command(): starting 11792 1727096149.48256: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/AnsiballZ_command.py && sleep 0' 11792 1727096149.49792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.49819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.49833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.50098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.50176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.66507: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-23 08:55:49.660497", "end": "2024-09-23 08:55:49.663653", "delta": "0:00:00.003156", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096149.68338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096149.68343: stdout chunk (state=3): >>><<< 11792 1727096149.68348: stderr chunk (state=3): >>><<< 11792 1727096149.68372: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-23 08:55:49.660497", "end": "2024-09-23 08:55:49.663653", "delta": "0:00:00.003156", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
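The use_carrier check above repeats the same transport sequence as the previous item, and the numbered _low_level_execute_command() steps are easier to follow side by side. Below is a rough Python sketch of that sequence under the assumption of plain ssh/scp reachability to the managed host; the host address, remote paths and shell snippets are the ones in the log, while the ssh() helper, the payload path and the use of scp (the log transfers the module over sftp on the multiplexed connection) are simplifications for illustration only.

import subprocess, time

HOST = "root@10.31.15.126"                   # managed host address from the SSH debug output
LOCAL_PAYLOAD = "/tmp/AnsiballZ_command.py"  # stand-in for the locally built AnsiballZ module

def ssh(cmd):
    # each step in the log is one "/bin/sh -c '... && sleep 0'" over the multiplexed connection
    return subprocess.run(["ssh", HOST, cmd], capture_output=True, text=True, check=True).stdout

home = ssh("echo ~ && sleep 0").strip()                          # 1. resolve the remote home dir
tmp = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-sketch"    # 2. unique per-task temp dir
ssh(f"( umask 77 && mkdir -p {home}/.ansible/tmp && mkdir {tmp} ) && sleep 0")
subprocess.run(["scp", LOCAL_PAYLOAD, f"{HOST}:{tmp}/AnsiballZ_command.py"], check=True)   # 3. transfer the module
ssh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_command.py && sleep 0")   # 4. mark the wrapper executable
print(ssh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_command.py && sleep 0"))  # 5. run it; module JSON on stdout
ssh(f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0")              # 6. remove the temp dir, as in the cleanup step

The printed stdout corresponds to the JSON result blobs in the log (e.g. {"changed": true, "stdout": "1", ... "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], ...}); Ansible then parses that JSON, and the callback reports the item as ok.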
11792 1727096149.68400: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/use_carrier', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096149.68405: _low_level_execute_command(): starting 11792 1727096149.68410: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096149.3399723-13028-57175195671768/ > /dev/null 2>&1 && sleep 0' 11792 1727096149.69688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.69710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.69810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.69813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.69875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.71766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.71772: stderr chunk (state=3): >>><<< 11792 1727096149.71775: stdout chunk (state=3): >>><<< 11792 1727096149.71792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.71795: handler run complete 11792 1727096149.71818: Evaluated conditional (False): False 11792 1727096149.71971: variable 'bond_opt' from source: unknown 11792 1727096149.72182: variable 'result' from source: unknown 11792 1727096149.72198: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096149.72206: attempt loop complete, returning result 11792 1727096149.72225: variable 'bond_opt' from source: unknown 11792 1727096149.72297: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'use_carrier', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "use_carrier", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/use_carrier" ], "delta": "0:00:00.003156", "end": "2024-09-23 08:55:49.663653", "rc": 0, "start": "2024-09-23 08:55:49.660497" } STDOUT: 1 11792 1727096149.72608: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096149.72654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096149.72657: variable 'omit' from source: magic vars 11792 1727096149.72981: variable 'ansible_distribution_major_version' from source: facts 11792 1727096149.72985: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096149.72990: variable 'omit' from source: magic vars 11792 1727096149.73005: variable 'omit' from source: magic vars 11792 1727096149.73169: variable 'controller_device' from source: play vars 11792 1727096149.73377: variable 'bond_opt' from source: unknown 11792 1727096149.73396: variable 'omit' from source: magic vars 11792 1727096149.73416: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096149.73425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096149.73432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096149.73446: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096149.73449: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096149.73451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096149.73529: Set connection var ansible_timeout to 10 11792 1727096149.73537: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096149.73546: Set connection var ansible_shell_executable to /bin/sh 11792 1727096149.73551: Set connection var ansible_pipelining to False 11792 1727096149.73554: Set connection var ansible_shell_type to sh 11792 1727096149.73560: Set connection var ansible_connection to ssh 11792 1727096149.73784: variable 'ansible_shell_executable' from source: unknown 11792 1727096149.73787: variable 'ansible_connection' from source: unknown 11792 1727096149.73790: variable 'ansible_module_compression' from source: unknown 11792 1727096149.73792: variable 
'ansible_shell_type' from source: unknown 11792 1727096149.73794: variable 'ansible_shell_executable' from source: unknown 11792 1727096149.73797: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096149.73851: variable 'ansible_pipelining' from source: unknown 11792 1727096149.73854: variable 'ansible_timeout' from source: unknown 11792 1727096149.73856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096149.73903: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096149.73911: variable 'omit' from source: magic vars 11792 1727096149.73914: starting attempt loop 11792 1727096149.73916: running the handler 11792 1727096149.73923: _low_level_execute_command(): starting 11792 1727096149.73926: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096149.75307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096149.75610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.75614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.75700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.77378: stdout chunk (state=3): >>>/root <<< 11792 1727096149.77475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.77522: stderr chunk (state=3): >>><<< 11792 1727096149.77525: stdout chunk (state=3): >>><<< 11792 1727096149.77625: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.77629: _low_level_execute_command(): starting 11792 1727096149.77631: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597 `" && echo ansible-tmp-1727096149.7754176-13028-146992189375597="` echo /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597 `" ) && sleep 0' 11792 1727096149.78806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096149.78851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.78883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.78959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.79083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.79143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.81200: stdout chunk (state=3): >>>ansible-tmp-1727096149.7754176-13028-146992189375597=/root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597 <<< 11792 1727096149.81346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.81361: stdout chunk (state=3): >>><<< 11792 1727096149.81378: stderr chunk (state=3): >>><<< 11792 1727096149.81777: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096149.7754176-13028-146992189375597=/root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.81781: variable 'ansible_module_compression' from source: unknown 11792 1727096149.81783: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096149.81785: variable 'ansible_facts' from source: unknown 11792 1727096149.81787: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/AnsiballZ_command.py 11792 1727096149.82196: Sending initial data 11792 1727096149.82200: Sent initial data (156 bytes) 11792 1727096149.83399: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096149.83494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096149.83503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.83693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.85365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096149.85525: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096149.85589: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmppc5wv8gp /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/AnsiballZ_command.py <<< 11792 1727096149.85592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/AnsiballZ_command.py" <<< 11792 1727096149.85641: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmppc5wv8gp" to remote "/root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/AnsiballZ_command.py" <<< 11792 1727096149.86748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.86805: stderr chunk (state=3): >>><<< 11792 1727096149.86813: stdout chunk (state=3): >>><<< 11792 1727096149.86866: done transferring module to remote 11792 1727096149.86946: _low_level_execute_command(): starting 11792 1727096149.86979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/ /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/AnsiballZ_command.py && sleep 0' 11792 1727096149.87967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096149.87973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.87976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096149.87978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096149.87980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096149.88284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.88512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096149.90271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096149.90310: stderr chunk (state=3): >>><<< 11792 1727096149.90314: stdout chunk (state=3): >>><<< 11792 1727096149.90586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096149.90590: _low_level_execute_command(): starting 11792 1727096149.90593: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/AnsiballZ_command.py && sleep 0' 11792 1727096149.91691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096149.91775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096149.91853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096149.92093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096149.92252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.08554: stdout chunk (state=3): >>> {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-23 08:55:50.079870", "end": "2024-09-23 08:55:50.083306", "delta": "0:00:00.003436", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096150.10164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096150.10172: stdout chunk (state=3): >>><<< 11792 1727096150.10179: stderr chunk (state=3): >>><<< 11792 1727096150.10380: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-23 08:55:50.079870", "end": "2024-09-23 08:55:50.083306", "delta": "0:00:00.003436", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
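The exchange above is the remote half of the "** TEST check bond settings" task: Ansible creates a temporary directory under /root/.ansible/tmp, uploads AnsiballZ_command.py over SFTP through the existing ControlMaster socket, marks it executable, runs it with /usr/bin/python3.12, and receives the module JSON carrying the sysfs value ("encap2+3 3"); the temporary directory is then cleaned up in the record that follows. The task file itself is not reproduced in this log, so the snippet below is only a sketch reconstructed from what the log does show: the loop variable bond_opt, the command, the changed: false display, and the conditional bond_opt.value in result.stdout. The variable name bond_options_to_assert is hypothetical.

    - name: "** TEST check bond settings"
      command: cat /sys/class/net/nm-bond/bonding/{{ bond_opt.key }}
      register: result
      changed_when: false                      # the displayed result reports changed: false
      until: bond_opt.value in result.stdout   # conditional the log evaluates after the run
      loop: "{{ bond_options_to_assert }}"     # hypothetical var; items look like {key: xmit_hash_policy, value: encap2+3}
      loop_control:
        loop_var: bond_opt

Run against managed_node2, the xmit_hash_policy item passes on the first attempt because the kernel reports "encap2+3 3".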
11792 1727096150.10407: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/xmit_hash_policy', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096150.10412: _low_level_execute_command(): starting 11792 1727096150.10420: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096149.7754176-13028-146992189375597/ > /dev/null 2>&1 && sleep 0' 11792 1727096150.11701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096150.11764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096150.11833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096150.11900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096150.11949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.13886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096150.13891: stdout chunk (state=3): >>><<< 11792 1727096150.14074: stderr chunk (state=3): >>><<< 11792 1727096150.14078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096150.14081: handler run complete 11792 1727096150.14083: Evaluated conditional (False): False 11792 1727096150.14102: variable 'bond_opt' from source: unknown 11792 1727096150.14109: variable 'result' from source: unknown 11792 1727096150.14128: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096150.14139: attempt loop complete, returning result 11792 1727096150.14159: variable 'bond_opt' from source: unknown 11792 1727096150.14222: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'xmit_hash_policy', 'value': 'encap2+3'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "xmit_hash_policy", "value": "encap2+3" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy" ], "delta": "0:00:00.003436", "end": "2024-09-23 08:55:50.083306", "rc": 0, "start": "2024-09-23 08:55:50.079870" } STDOUT: encap2+3 3 11792 1727096150.14366: dumping result to json 11792 1727096150.14372: done dumping result, returning 11792 1727096150.14375: done running TaskExecutor() for managed_node2/TASK: ** TEST check bond settings [0afff68d-5257-d9c7-3fc0-000000000400] 11792 1727096150.14378: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000400 11792 1727096150.15354: no more pending results, returning what we have 11792 1727096150.15358: results queue empty 11792 1727096150.15359: checking for any_errors_fatal 11792 1727096150.15365: done checking for any_errors_fatal 11792 1727096150.15366: checking for max_fail_percentage 11792 1727096150.15369: done checking for max_fail_percentage 11792 1727096150.15370: checking to see if all hosts have failed and the running result is not ok 11792 1727096150.15371: done checking to see if all hosts have failed 11792 1727096150.15372: getting the remaining hosts for this loop 11792 1727096150.15373: done getting the remaining hosts for this loop 11792 1727096150.15376: getting the next task for host managed_node2 11792 1727096150.15383: done getting next task for host managed_node2 11792 1727096150.15385: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 11792 1727096150.15388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096150.15436: getting variables 11792 1727096150.15437: in VariableManager get_vars() 11792 1727096150.15465: Calling all_inventory to load vars for managed_node2 11792 1727096150.15572: Calling groups_inventory to load vars for managed_node2 11792 1727096150.15576: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096150.15582: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000400 11792 1727096150.15586: WORKER PROCESS EXITING 11792 1727096150.15596: Calling all_plugins_play to load vars for managed_node2 11792 1727096150.15599: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096150.15603: Calling groups_plugins_play to load vars for managed_node2 11792 1727096150.17179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096150.18872: done with get_vars() 11792 1727096150.18902: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Monday 23 September 2024 08:55:50 -0400 (0:00:05.911) 0:00:32.469 ****** 11792 1727096150.19006: entering _queue_task() for managed_node2/include_tasks 11792 1727096150.19504: worker is 1 (out of 1 available) 11792 1727096150.19513: exiting _queue_task() for managed_node2/include_tasks 11792 1727096150.19526: done queuing things up, now waiting for results queue to drain 11792 1727096150.19527: waiting for pending results... 11792 1727096150.19728: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' 11792 1727096150.19874: in run() - task 0afff68d-5257-d9c7-3fc0-000000000402 11792 1727096150.19894: variable 'ansible_search_path' from source: unknown 11792 1727096150.19902: variable 'ansible_search_path' from source: unknown 11792 1727096150.19942: calling self._execute() 11792 1727096150.20056: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096150.20077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096150.20091: variable 'omit' from source: magic vars 11792 1727096150.20472: variable 'ansible_distribution_major_version' from source: facts 11792 1727096150.20490: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096150.20508: _execute() done 11792 1727096150.20516: dumping result to json 11792 1727096150.20523: done dumping result, returning 11792 1727096150.20532: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' [0afff68d-5257-d9c7-3fc0-000000000402] 11792 1727096150.20541: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000402 11792 1727096150.20689: no more pending results, returning what we have 11792 1727096150.20694: in VariableManager get_vars() 11792 1727096150.20732: Calling all_inventory to load vars for managed_node2 11792 1727096150.20736: Calling groups_inventory to load vars for managed_node2 11792 1727096150.20739: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096150.20756: Calling all_plugins_play to load vars for managed_node2 11792 1727096150.20760: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096150.20763: Calling groups_plugins_play to load vars for managed_node2 11792 1727096150.21601: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000402 11792 
1727096150.21605: WORKER PROCESS EXITING 11792 1727096150.22655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096150.24434: done with get_vars() 11792 1727096150.24460: variable 'ansible_search_path' from source: unknown 11792 1727096150.24462: variable 'ansible_search_path' from source: unknown 11792 1727096150.24479: variable 'item' from source: include params 11792 1727096150.24609: variable 'item' from source: include params 11792 1727096150.24644: we have included files to process 11792 1727096150.24646: generating all_blocks data 11792 1727096150.24648: done generating all_blocks data 11792 1727096150.24655: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11792 1727096150.24656: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11792 1727096150.24658: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11792 1727096150.24932: done processing included file 11792 1727096150.24935: iterating over new_blocks loaded from include file 11792 1727096150.24936: in VariableManager get_vars() 11792 1727096150.24958: done with get_vars() 11792 1727096150.24963: filtering new block on tags 11792 1727096150.25000: done filtering new block on tags 11792 1727096150.25003: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node2 11792 1727096150.25009: extending task lists for all hosts with included blocks 11792 1727096150.25265: done extending task lists 11792 1727096150.25266: done processing included files 11792 1727096150.25269: results queue empty 11792 1727096150.25270: checking for any_errors_fatal 11792 1727096150.25289: done checking for any_errors_fatal 11792 1727096150.25290: checking for max_fail_percentage 11792 1727096150.25292: done checking for max_fail_percentage 11792 1727096150.25293: checking to see if all hosts have failed and the running result is not ok 11792 1727096150.25293: done checking to see if all hosts have failed 11792 1727096150.25294: getting the remaining hosts for this loop 11792 1727096150.25295: done getting the remaining hosts for this loop 11792 1727096150.25298: getting the next task for host managed_node2 11792 1727096150.25303: done getting next task for host managed_node2 11792 1727096150.25305: ^ task is: TASK: ** TEST check IPv4 11792 1727096150.25308: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096150.25313: getting variables 11792 1727096150.25314: in VariableManager get_vars() 11792 1727096150.25324: Calling all_inventory to load vars for managed_node2 11792 1727096150.25326: Calling groups_inventory to load vars for managed_node2 11792 1727096150.25328: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096150.25334: Calling all_plugins_play to load vars for managed_node2 11792 1727096150.25336: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096150.25345: Calling groups_plugins_play to load vars for managed_node2 11792 1727096150.26695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096150.28523: done with get_vars() 11792 1727096150.28554: done getting variables 11792 1727096150.28604: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Monday 23 September 2024 08:55:50 -0400 (0:00:00.096) 0:00:32.565 ****** 11792 1727096150.28644: entering _queue_task() for managed_node2/command 11792 1727096150.29059: worker is 1 (out of 1 available) 11792 1727096150.29177: exiting _queue_task() for managed_node2/command 11792 1727096150.29190: done queuing things up, now waiting for results queue to drain 11792 1727096150.29191: waiting for pending results... 
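With the bond-option loop complete, the play reaches line 11 of assert_bond_options.yml and includes assert_IPv4_present.yml. The log notes that item, interface and address arrive as include params, that controller_device comes from play vars, and that each task is gated on ansible_distribution_major_version != '6'. The exact include wiring is not visible in this excerpt, so the sketch below only illustrates the shape such an include could take; the loop data is invented for illustration and the address parameter is omitted because its source is not shown.

    # assert_bond_options.yml, around line 11 (shape inferred from the log, not copied from it)
    - name: "Include the task 'assert_IPv4_present.yml'"
      include_tasks: "tasks/{{ item }}"
      vars:
        interface: "{{ controller_device }}"   # the log resolves interface via controller_device
      loop:
        - assert_IPv4_present.yml              # illustrative item; the real loop is not shown
      when: ansible_distribution_major_version != '6'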
11792 1727096150.29425: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 11792 1727096150.29580: in run() - task 0afff68d-5257-d9c7-3fc0-000000000631 11792 1727096150.29612: variable 'ansible_search_path' from source: unknown 11792 1727096150.29624: variable 'ansible_search_path' from source: unknown 11792 1727096150.29677: calling self._execute() 11792 1727096150.29781: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096150.29794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096150.29808: variable 'omit' from source: magic vars 11792 1727096150.30247: variable 'ansible_distribution_major_version' from source: facts 11792 1727096150.30306: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096150.30310: variable 'omit' from source: magic vars 11792 1727096150.30353: variable 'omit' from source: magic vars 11792 1727096150.30538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096150.33058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096150.33274: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096150.33278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096150.33281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096150.33283: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096150.33350: variable 'interface' from source: include params 11792 1727096150.33363: variable 'controller_device' from source: play vars 11792 1727096150.33445: variable 'controller_device' from source: play vars 11792 1727096150.33478: variable 'omit' from source: magic vars 11792 1727096150.33528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096150.33563: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096150.33596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096150.33629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096150.33646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096150.33731: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096150.33734: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096150.33737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096150.33813: Set connection var ansible_timeout to 10 11792 1727096150.33829: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096150.33858: Set connection var ansible_shell_executable to /bin/sh 11792 1727096150.33875: Set connection var ansible_pipelining to False 11792 1727096150.33883: Set connection var ansible_shell_type to sh 11792 1727096150.33891: Set connection var ansible_connection to ssh 11792 1727096150.33948: variable 'ansible_shell_executable' from source: unknown 11792 1727096150.33951: variable 
'ansible_connection' from source: unknown 11792 1727096150.33954: variable 'ansible_module_compression' from source: unknown 11792 1727096150.33956: variable 'ansible_shell_type' from source: unknown 11792 1727096150.33958: variable 'ansible_shell_executable' from source: unknown 11792 1727096150.33960: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096150.33962: variable 'ansible_pipelining' from source: unknown 11792 1727096150.33964: variable 'ansible_timeout' from source: unknown 11792 1727096150.33966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096150.34170: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096150.34174: variable 'omit' from source: magic vars 11792 1727096150.34177: starting attempt loop 11792 1727096150.34180: running the handler 11792 1727096150.34182: _low_level_execute_command(): starting 11792 1727096150.34185: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096150.34985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096150.35058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096150.35101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096150.35314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.36930: stdout chunk (state=3): >>>/root <<< 11792 1727096150.37166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096150.37174: stdout chunk (state=3): >>><<< 11792 1727096150.37176: stderr chunk (state=3): >>><<< 11792 1727096150.37210: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096150.37484: _low_level_execute_command(): starting 11792 1727096150.37490: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479 `" && echo ansible-tmp-1727096150.3737621-13389-261861649792479="` echo /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479 `" ) && sleep 0' 11792 1727096150.38692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096150.38789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096150.38802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096150.38830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096150.39005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.41112: stdout chunk (state=3): >>>ansible-tmp-1727096150.3737621-13389-261861649792479=/root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479 <<< 11792 1727096150.41172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096150.41226: stderr chunk (state=3): >>><<< 11792 1727096150.41230: stdout chunk (state=3): >>><<< 11792 1727096150.41430: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096150.3737621-13389-261861649792479=/root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096150.41434: variable 'ansible_module_compression' from source: unknown 11792 1727096150.41436: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096150.41438: variable 'ansible_facts' from source: unknown 11792 1727096150.41695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/AnsiballZ_command.py 11792 1727096150.42224: Sending initial data 11792 1727096150.42227: Sent initial data (156 bytes) 11792 1727096150.43219: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096150.43229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096150.43242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096150.43577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096150.43584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096150.43587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096150.43660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.45315: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096150.45348: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096150.45383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096150.45651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmphk2p42f0 /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/AnsiballZ_command.py <<< 11792 1727096150.45655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmphk2p42f0" to remote "/root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/AnsiballZ_command.py" <<< 11792 1727096150.47134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096150.47174: stderr chunk (state=3): >>><<< 11792 1727096150.47177: stdout chunk (state=3): >>><<< 11792 1727096150.47236: done transferring module to remote 11792 1727096150.47247: _low_level_execute_command(): starting 11792 1727096150.47254: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/ /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/AnsiballZ_command.py && sleep 0' 11792 1727096150.48777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096150.48782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096150.48785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096150.48787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096150.48790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096150.48798: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096150.48800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096150.48803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096150.48805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096150.48806: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096150.48808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096150.48992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096150.49132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096150.49202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.51136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096150.51164: stderr chunk (state=3): 
>>><<< 11792 1727096150.51285: stdout chunk (state=3): >>><<< 11792 1727096150.51288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096150.51291: _low_level_execute_command(): starting 11792 1727096150.51293: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/AnsiballZ_command.py && sleep 0' 11792 1727096150.52504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096150.52606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096150.52702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096150.52727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096150.52842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.69165: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.3/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:50.686214", "end": "2024-09-23 08:55:50.690150", "delta": "0:00:00.003936", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096150.70928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096150.70932: stdout chunk (state=3): >>><<< 11792 1727096150.70935: stderr chunk (state=3): >>><<< 11792 1727096150.70937: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.3/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:50.686214", "end": "2024-09-23 08:55:50.690150", "delta": "0:00:00.003936", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096150.71180: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096150.71189: _low_level_execute_command(): starting 11792 1727096150.71191: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096150.3737621-13389-261861649792479/ > /dev/null 2>&1 && sleep 0' 11792 1727096150.72028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096150.72038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096150.72049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096150.72070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096150.72084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096150.72092: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096150.72101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096150.72193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096150.72212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096150.72286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096150.74238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096150.74242: stdout chunk (state=3): >>><<< 11792 1727096150.74248: stderr chunk (state=3): >>><<< 11792 1727096150.74389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096150.74393: handler run complete 11792 1727096150.74418: Evaluated conditional (False): False 11792 1727096150.74806: variable 'address' from source: include params 11792 1727096150.74809: variable 'result' from source: set_fact 11792 1727096150.74827: Evaluated conditional (address in result.stdout): True 11792 1727096150.74840: attempt loop complete, returning result 11792 1727096150.74843: _execute() done 11792 1727096150.74845: dumping result to json 11792 1727096150.74872: done dumping result, returning 11792 1727096150.74875: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 [0afff68d-5257-d9c7-3fc0-000000000631] 11792 1727096150.74885: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000631 11792 1727096150.74965: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000631 11792 1727096150.74970: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003936", "end": "2024-09-23 08:55:50.690150", "rc": 0, "start": "2024-09-23 08:55:50.686214" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.3/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 228sec preferred_lft 228sec 11792 1727096150.75071: no more pending results, returning what we have 11792 1727096150.75075: results queue empty 11792 1727096150.75076: checking for any_errors_fatal 11792 1727096150.75078: done checking for any_errors_fatal 11792 1727096150.75079: checking for max_fail_percentage 11792 1727096150.75081: done checking for max_fail_percentage 11792 1727096150.75082: checking to see if all hosts have failed and the running result is not ok 11792 1727096150.75082: done checking to see if all hosts have failed 11792 1727096150.75083: getting the remaining hosts for this loop 11792 1727096150.75084: done getting the remaining hosts for this loop 11792 1727096150.75089: getting the next task for host managed_node2 11792 1727096150.75099: done getting next task for host managed_node2 11792 1727096150.75102: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 11792 1727096150.75106: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096150.75111: getting variables 11792 1727096150.75112: in VariableManager get_vars() 11792 1727096150.75147: Calling all_inventory to load vars for managed_node2 11792 1727096150.75151: Calling groups_inventory to load vars for managed_node2 11792 1727096150.75155: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096150.75370: Calling all_plugins_play to load vars for managed_node2 11792 1727096150.75376: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096150.75384: Calling groups_plugins_play to load vars for managed_node2 11792 1727096150.78252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096150.80265: done with get_vars() 11792 1727096150.80305: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Monday 23 September 2024 08:55:50 -0400 (0:00:00.517) 0:00:33.083 ****** 11792 1727096150.80407: entering _queue_task() for managed_node2/include_tasks 11792 1727096150.80934: worker is 1 (out of 1 available) 11792 1727096150.80946: exiting _queue_task() for managed_node2/include_tasks 11792 1727096150.80961: done queuing things up, now waiting for results queue to drain 11792 1727096150.80963: waiting for pending results... 11792 1727096150.81345: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' 11792 1727096150.81443: in run() - task 0afff68d-5257-d9c7-3fc0-000000000403 11792 1727096150.81447: variable 'ansible_search_path' from source: unknown 11792 1727096150.81454: variable 'ansible_search_path' from source: unknown 11792 1727096150.81478: calling self._execute() 11792 1727096150.81591: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096150.81603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096150.81616: variable 'omit' from source: magic vars 11792 1727096150.82019: variable 'ansible_distribution_major_version' from source: facts 11792 1727096150.82096: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096150.82099: _execute() done 11792 1727096150.82103: dumping result to json 11792 1727096150.82105: done dumping result, returning 11792 1727096150.82107: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' [0afff68d-5257-d9c7-3fc0-000000000403] 11792 1727096150.82109: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000403 11792 1727096150.82233: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000403 11792 1727096150.82236: WORKER PROCESS EXITING 11792 1727096150.82308: no more pending results, returning what we have 11792 1727096150.82570: in VariableManager get_vars() 11792 1727096150.82605: Calling all_inventory to load vars for managed_node2 11792 1727096150.82608: Calling groups_inventory to load vars for managed_node2 11792 1727096150.82611: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096150.82622: Calling all_plugins_play to load vars for managed_node2 11792 1727096150.82625: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096150.82629: Calling 
groups_plugins_play to load vars for managed_node2 11792 1727096150.84090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096150.85793: done with get_vars() 11792 1727096150.85829: variable 'ansible_search_path' from source: unknown 11792 1727096150.85831: variable 'ansible_search_path' from source: unknown 11792 1727096150.85842: variable 'item' from source: include params 11792 1727096150.85960: variable 'item' from source: include params 11792 1727096150.85997: we have included files to process 11792 1727096150.85998: generating all_blocks data 11792 1727096150.86000: done generating all_blocks data 11792 1727096150.86006: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11792 1727096150.86008: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11792 1727096150.86010: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11792 1727096150.86270: done processing included file 11792 1727096150.86273: iterating over new_blocks loaded from include file 11792 1727096150.86274: in VariableManager get_vars() 11792 1727096150.86292: done with get_vars() 11792 1727096150.86294: filtering new block on tags 11792 1727096150.86322: done filtering new block on tags 11792 1727096150.86325: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node2 11792 1727096150.86331: extending task lists for all hosts with included blocks 11792 1727096150.86705: done extending task lists 11792 1727096150.86707: done processing included files 11792 1727096150.86708: results queue empty 11792 1727096150.86708: checking for any_errors_fatal 11792 1727096150.86713: done checking for any_errors_fatal 11792 1727096150.86714: checking for max_fail_percentage 11792 1727096150.86715: done checking for max_fail_percentage 11792 1727096150.86716: checking to see if all hosts have failed and the running result is not ok 11792 1727096150.86717: done checking to see if all hosts have failed 11792 1727096150.86717: getting the remaining hosts for this loop 11792 1727096150.86718: done getting the remaining hosts for this loop 11792 1727096150.86721: getting the next task for host managed_node2 11792 1727096150.86726: done getting next task for host managed_node2 11792 1727096150.86728: ^ task is: TASK: ** TEST check IPv6 11792 1727096150.86732: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096150.86734: getting variables 11792 1727096150.86735: in VariableManager get_vars() 11792 1727096150.86747: Calling all_inventory to load vars for managed_node2 11792 1727096150.86750: Calling groups_inventory to load vars for managed_node2 11792 1727096150.86754: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096150.86760: Calling all_plugins_play to load vars for managed_node2 11792 1727096150.86762: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096150.86765: Calling groups_plugins_play to load vars for managed_node2 11792 1727096150.88102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096150.90845: done with get_vars() 11792 1727096150.90894: done getting variables 11792 1727096150.90947: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Monday 23 September 2024 08:55:50 -0400 (0:00:00.105) 0:00:33.189 ****** 11792 1727096150.90984: entering _queue_task() for managed_node2/command 11792 1727096150.91380: worker is 1 (out of 1 available) 11792 1727096150.91393: exiting _queue_task() for managed_node2/command 11792 1727096150.91407: done queuing things up, now waiting for results queue to drain 11792 1727096150.91409: waiting for pending results... 
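The "** TEST check IPv6" task queued above runs "ip -6 a s nm-bond" via the command module, registers the output, and passes once the DHCP-assigned address shows up (the trace below reports "Evaluated conditional (address in result.stdout): True" with attempts=1). A minimal sketch of what such a task in assert_IPv6_present.yml could look like; the variable names (controller_device, address, result) are taken from the "from source: play vars / include params" lines, while the retries and delay values are assumptions not confirmed by the log:

# Sketch only: reconstructed from the log output, not the actual
# assert_IPv6_present.yml. retries/delay values are assumptions.
- name: "** TEST check IPv6"
  command: ip -6 a s {{ controller_device }}
  register: result
  until: address in result.stdout
  retries: 20
  delay: 2
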
11792 1727096150.92171: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 11792 1727096150.92224: in run() - task 0afff68d-5257-d9c7-3fc0-000000000652 11792 1727096150.92251: variable 'ansible_search_path' from source: unknown 11792 1727096150.92372: variable 'ansible_search_path' from source: unknown 11792 1727096150.92376: calling self._execute() 11792 1727096150.92621: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096150.92709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096150.92724: variable 'omit' from source: magic vars 11792 1727096150.93818: variable 'ansible_distribution_major_version' from source: facts 11792 1727096150.93823: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096150.93825: variable 'omit' from source: magic vars 11792 1727096150.93829: variable 'omit' from source: magic vars 11792 1727096150.94397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096150.98990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096150.99176: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096150.99294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096150.99333: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096150.99461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096150.99551: variable 'controller_device' from source: play vars 11792 1727096150.99717: variable 'omit' from source: magic vars 11792 1727096150.99741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096150.99777: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096150.99935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096150.99938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096150.99940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096151.00021: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096151.00030: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096151.00043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096151.00272: Set connection var ansible_timeout to 10 11792 1727096151.00288: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096151.00370: Set connection var ansible_shell_executable to /bin/sh 11792 1727096151.00373: Set connection var ansible_pipelining to False 11792 1727096151.00375: Set connection var ansible_shell_type to sh 11792 1727096151.00378: Set connection var ansible_connection to ssh 11792 1727096151.00474: variable 'ansible_shell_executable' from source: unknown 11792 1727096151.00477: variable 'ansible_connection' from source: unknown 11792 1727096151.00479: variable 'ansible_module_compression' from source: unknown 11792 1727096151.00482: variable 
'ansible_shell_type' from source: unknown 11792 1727096151.00484: variable 'ansible_shell_executable' from source: unknown 11792 1727096151.00486: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096151.00488: variable 'ansible_pipelining' from source: unknown 11792 1727096151.00490: variable 'ansible_timeout' from source: unknown 11792 1727096151.00583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096151.00723: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096151.00780: variable 'omit' from source: magic vars 11792 1727096151.00791: starting attempt loop 11792 1727096151.00803: running the handler 11792 1727096151.00931: _low_level_execute_command(): starting 11792 1727096151.01018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096151.02301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096151.02443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096151.02574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096151.02793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096151.02877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096151.04735: stdout chunk (state=3): >>>/root <<< 11792 1727096151.04864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096151.04894: stderr chunk (state=3): >>><<< 11792 1727096151.04897: stdout chunk (state=3): >>><<< 11792 1727096151.04917: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096151.04952: _low_level_execute_command(): starting 11792 1727096151.04983: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723 `" && echo ansible-tmp-1727096151.0493836-13421-181385549676723="` echo /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723 `" ) && sleep 0' 11792 1727096151.06272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096151.06456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096151.06565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096151.06674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096151.08645: stdout chunk (state=3): >>>ansible-tmp-1727096151.0493836-13421-181385549676723=/root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723 <<< 11792 1727096151.08784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096151.08860: stderr chunk (state=3): >>><<< 11792 1727096151.08879: stdout chunk (state=3): >>><<< 11792 1727096151.08973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096151.0493836-13421-181385549676723=/root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096151.09010: variable 'ansible_module_compression' from source: unknown 11792 1727096151.09276: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096151.09280: variable 'ansible_facts' from source: unknown 11792 1727096151.09354: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/AnsiballZ_command.py 11792 1727096151.09643: Sending initial data 11792 1727096151.09717: Sent initial data (156 bytes) 11792 1727096151.11052: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096151.11074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096151.11154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096151.11344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096151.11371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096151.11411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096151.11453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096151.13133: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096151.13268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096151.13277: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/AnsiballZ_command.py" <<< 11792 1727096151.13280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp4nqbjfoa /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/AnsiballZ_command.py <<< 11792 1727096151.13776: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp4nqbjfoa" to remote "/root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/AnsiballZ_command.py" <<< 11792 1727096151.14646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096151.14664: stdout chunk (state=3): >>><<< 11792 1727096151.14681: stderr chunk (state=3): >>><<< 11792 1727096151.14720: done transferring module to remote 11792 1727096151.14789: _low_level_execute_command(): starting 11792 1727096151.14798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/ /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/AnsiballZ_command.py && sleep 0' 11792 1727096151.16101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096151.16196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096151.16300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096151.16318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096151.16339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096151.16461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096151.18403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096151.18682: stderr chunk (state=3): >>><<< 11792 1727096151.18686: stdout chunk (state=3): >>><<< 11792 1727096151.18689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096151.18691: _low_level_execute_command(): starting 11792 1727096151.18694: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/AnsiballZ_command.py && sleep 0' 11792 1727096151.19743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096151.19762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096151.19882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096151.19971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096151.20008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096151.20012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096151.20086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096151.36345: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::128/128 scope global dynamic noprefixroute \n valid_lft 228sec preferred_lft 228sec\n inet6 2001:db8::acc7:d9ff:fee6:c07e/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::acc7:d9ff:fee6:c07e/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:51.357574", "end": "2024-09-23 08:55:51.361615", "delta": "0:00:00.004041", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096151.38155: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096151.38160: stdout chunk (state=3): >>><<< 11792 1727096151.38171: stderr chunk (state=3): >>><<< 11792 1727096151.38196: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::128/128 scope global dynamic noprefixroute \n valid_lft 228sec preferred_lft 228sec\n inet6 2001:db8::acc7:d9ff:fee6:c07e/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::acc7:d9ff:fee6:c07e/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:51.357574", "end": "2024-09-23 08:55:51.361615", "delta": "0:00:00.004041", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
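The exchange above is the usual command-module round trip when pipelining is disabled: ansible-core creates a remote temp directory under ~/.ansible/tmp, copies AnsiballZ_command.py to it with sftp over the existing multiplexed SSH connection, sets the execute bit, runs it with /usr/bin/python3.12, and (as shown just below) removes the temp directory afterwards. The behaviour is driven by the "Set connection var ..." values earlier in this task; written as host or group variables they would look roughly like the following sketch (values copied from the log; defining them in inventory vars is an assumption, as the same values can also come from ansible.cfg or built-in defaults):

# Illustration only: connection variables reported by the
# "Set connection var ..." lines, expressed as host/group vars.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false   # with pipelining off, the module file is copied via sftp as above
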
11792 1727096151.38237: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096151.38349: _low_level_execute_command(): starting 11792 1727096151.38353: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096151.0493836-13421-181385549676723/ > /dev/null 2>&1 && sleep 0' 11792 1727096151.38897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096151.38916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096151.38932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096151.38953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096151.38974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096151.39076: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096151.39097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096151.39115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096151.39194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096151.41161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096151.41480: stdout chunk (state=3): >>><<< 11792 1727096151.41484: stderr chunk (state=3): >>><<< 11792 1727096151.41487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096151.41490: handler run complete 11792 1727096151.41492: Evaluated conditional (False): False 11792 1727096151.41719: variable 'address' from source: include params 11792 1727096151.41730: variable 'result' from source: set_fact 11792 1727096151.41755: Evaluated conditional (address in result.stdout): True 11792 1727096151.41775: attempt loop complete, returning result 11792 1727096151.42073: _execute() done 11792 1727096151.42077: dumping result to json 11792 1727096151.42079: done dumping result, returning 11792 1727096151.42081: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 [0afff68d-5257-d9c7-3fc0-000000000652] 11792 1727096151.42084: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000652 11792 1727096151.42159: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000652 11792 1727096151.42163: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.004041", "end": "2024-09-23 08:55:51.361615", "rc": 0, "start": "2024-09-23 08:55:51.357574" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::128/128 scope global dynamic noprefixroute valid_lft 228sec preferred_lft 228sec inet6 2001:db8::acc7:d9ff:fee6:c07e/64 scope global dynamic noprefixroute valid_lft 1799sec preferred_lft 1799sec inet6 fe80::acc7:d9ff:fee6:c07e/64 scope link noprefixroute valid_lft forever preferred_lft forever 11792 1727096151.42243: no more pending results, returning what we have 11792 1727096151.42247: results queue empty 11792 1727096151.42247: checking for any_errors_fatal 11792 1727096151.42249: done checking for any_errors_fatal 11792 1727096151.42250: checking for max_fail_percentage 11792 1727096151.42251: done checking for max_fail_percentage 11792 1727096151.42254: checking to see if all hosts have failed and the running result is not ok 11792 1727096151.42255: done checking to see if all hosts have failed 11792 1727096151.42256: getting the remaining hosts for this loop 11792 1727096151.42257: done getting the remaining hosts for this loop 11792 1727096151.42261: getting the next task for host managed_node2 11792 1727096151.42271: done getting next task for host managed_node2 11792 1727096151.42274: ^ task is: TASK: Conditional asserts 11792 1727096151.42277: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096151.42282: getting variables 11792 1727096151.42283: in VariableManager get_vars() 11792 1727096151.42313: Calling all_inventory to load vars for managed_node2 11792 1727096151.42316: Calling groups_inventory to load vars for managed_node2 11792 1727096151.42319: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096151.42330: Calling all_plugins_play to load vars for managed_node2 11792 1727096151.42333: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096151.42335: Calling groups_plugins_play to load vars for managed_node2 11792 1727096151.43696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096151.46427: done with get_vars() 11792 1727096151.46586: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Monday 23 September 2024 08:55:51 -0400 (0:00:00.558) 0:00:33.747 ****** 11792 1727096151.46825: entering _queue_task() for managed_node2/include_tasks 11792 1727096151.47682: worker is 1 (out of 1 available) 11792 1727096151.47693: exiting _queue_task() for managed_node2/include_tasks 11792 1727096151.47705: done queuing things up, now waiting for results queue to drain 11792 1727096151.47706: waiting for pending results... 11792 1727096151.48000: running TaskExecutor() for managed_node2/TASK: Conditional asserts 11792 1727096151.48259: in run() - task 0afff68d-5257-d9c7-3fc0-00000000008e 11792 1727096151.48277: variable 'ansible_search_path' from source: unknown 11792 1727096151.48281: variable 'ansible_search_path' from source: unknown 11792 1727096151.48859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096151.53130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096151.53314: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096151.53319: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096151.53321: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096151.53330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096151.53424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096151.53456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096151.53599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096151.53637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096151.53655: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096151.54047: dumping result to json 11792 1727096151.54050: done dumping result, returning 11792 1727096151.54058: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0afff68d-5257-d9c7-3fc0-00000000008e] 11792 1727096151.54064: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008e 11792 1727096151.54234: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008e 11792 1727096151.54238: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 11792 1727096151.54307: no more pending results, returning what we have 11792 1727096151.54311: results queue empty 11792 1727096151.54312: checking for any_errors_fatal 11792 1727096151.54322: done checking for any_errors_fatal 11792 1727096151.54323: checking for max_fail_percentage 11792 1727096151.54325: done checking for max_fail_percentage 11792 1727096151.54326: checking to see if all hosts have failed and the running result is not ok 11792 1727096151.54327: done checking to see if all hosts have failed 11792 1727096151.54327: getting the remaining hosts for this loop 11792 1727096151.54329: done getting the remaining hosts for this loop 11792 1727096151.54333: getting the next task for host managed_node2 11792 1727096151.54339: done getting next task for host managed_node2 11792 1727096151.54341: ^ task is: TASK: Success in test '{{ lsr_description }}' 11792 1727096151.54344: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096151.54348: getting variables 11792 1727096151.54349: in VariableManager get_vars() 11792 1727096151.54387: Calling all_inventory to load vars for managed_node2 11792 1727096151.54390: Calling groups_inventory to load vars for managed_node2 11792 1727096151.54393: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096151.54519: Calling all_plugins_play to load vars for managed_node2 11792 1727096151.54522: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096151.54525: Calling groups_plugins_play to load vars for managed_node2 11792 1727096151.59230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096151.61730: done with get_vars() 11792 1727096151.61774: done getting variables 11792 1727096151.61858: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096151.61992: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Monday 23 September 2024 08:55:51 -0400 (0:00:00.151) 0:00:33.899 ****** 11792 1727096151.62023: entering _queue_task() for managed_node2/debug 11792 1727096151.62810: worker is 1 (out of 1 available) 11792 1727096151.62821: exiting _queue_task() for managed_node2/debug 11792 1727096151.62832: done queuing things up, now waiting for results queue to drain 11792 1727096151.62834: waiting for pending results... 11792 1727096151.63135: running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
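The "Success in test ..." task queued here is a plain debug task at run_test.yml:47; its rendered message appears a few lines further down ("+++++ Success in test '...' +++++"). A minimal sketch of what that task could look like, assuming the exact msg formatting, which the log only shows after templating:

# Sketch only: inferred from the rendered task name and MSG in the log,
# not copied from run_test.yml.
- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
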
11792 1727096151.63141: in run() - task 0afff68d-5257-d9c7-3fc0-00000000008f 11792 1727096151.63144: variable 'ansible_search_path' from source: unknown 11792 1727096151.63147: variable 'ansible_search_path' from source: unknown 11792 1727096151.63170: calling self._execute() 11792 1727096151.63357: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096151.63385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096151.63405: variable 'omit' from source: magic vars 11792 1727096151.64240: variable 'ansible_distribution_major_version' from source: facts 11792 1727096151.64294: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096151.64323: variable 'omit' from source: magic vars 11792 1727096151.64432: variable 'omit' from source: magic vars 11792 1727096151.64555: variable 'lsr_description' from source: include params 11792 1727096151.64663: variable 'omit' from source: magic vars 11792 1727096151.64785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096151.64878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096151.64893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096151.64913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096151.64938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096151.65047: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096151.65050: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096151.65056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096151.65156: Set connection var ansible_timeout to 10 11792 1727096151.65173: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096151.65193: Set connection var ansible_shell_executable to /bin/sh 11792 1727096151.65205: Set connection var ansible_pipelining to False 11792 1727096151.65213: Set connection var ansible_shell_type to sh 11792 1727096151.65270: Set connection var ansible_connection to ssh 11792 1727096151.65442: variable 'ansible_shell_executable' from source: unknown 11792 1727096151.65446: variable 'ansible_connection' from source: unknown 11792 1727096151.65448: variable 'ansible_module_compression' from source: unknown 11792 1727096151.65451: variable 'ansible_shell_type' from source: unknown 11792 1727096151.65455: variable 'ansible_shell_executable' from source: unknown 11792 1727096151.65457: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096151.65459: variable 'ansible_pipelining' from source: unknown 11792 1727096151.65461: variable 'ansible_timeout' from source: unknown 11792 1727096151.65463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096151.65596: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096151.65622: variable 'omit' from source: magic vars 11792 1727096151.65642: 
starting attempt loop 11792 1727096151.65651: running the handler 11792 1727096151.65706: handler run complete 11792 1727096151.65738: attempt loop complete, returning result 11792 1727096151.65745: _execute() done 11792 1727096151.65748: dumping result to json 11792 1727096151.65751: done dumping result, returning 11792 1727096151.65854: done running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0afff68d-5257-d9c7-3fc0-00000000008f] 11792 1727096151.65859: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008f ok: [managed_node2] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 11792 1727096151.66120: no more pending results, returning what we have 11792 1727096151.66124: results queue empty 11792 1727096151.66127: checking for any_errors_fatal 11792 1727096151.66135: done checking for any_errors_fatal 11792 1727096151.66136: checking for max_fail_percentage 11792 1727096151.66138: done checking for max_fail_percentage 11792 1727096151.66139: checking to see if all hosts have failed and the running result is not ok 11792 1727096151.66140: done checking to see if all hosts have failed 11792 1727096151.66141: getting the remaining hosts for this loop 11792 1727096151.66143: done getting the remaining hosts for this loop 11792 1727096151.66147: getting the next task for host managed_node2 11792 1727096151.66159: done getting next task for host managed_node2 11792 1727096151.66162: ^ task is: TASK: Cleanup 11792 1727096151.66166: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096151.66173: getting variables 11792 1727096151.66175: in VariableManager get_vars() 11792 1727096151.66210: Calling all_inventory to load vars for managed_node2 11792 1727096151.66213: Calling groups_inventory to load vars for managed_node2 11792 1727096151.66216: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096151.66229: Calling all_plugins_play to load vars for managed_node2 11792 1727096151.66233: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096151.66236: Calling groups_plugins_play to load vars for managed_node2 11792 1727096151.66782: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000008f 11792 1727096151.66786: WORKER PROCESS EXITING 11792 1727096151.78746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096151.80682: done with get_vars() 11792 1727096151.80742: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Monday 23 September 2024 08:55:51 -0400 (0:00:00.188) 0:00:34.087 ****** 11792 1727096151.80876: entering _queue_task() for managed_node2/include_tasks 11792 1727096151.81428: worker is 1 (out of 1 available) 11792 1727096151.81438: exiting _queue_task() for managed_node2/include_tasks 11792 1727096151.81451: done queuing things up, now waiting for results queue to drain 11792 1727096151.81456: waiting for pending results... 11792 1727096151.81770: running TaskExecutor() for managed_node2/TASK: Cleanup 11792 1727096151.81901: in run() - task 0afff68d-5257-d9c7-3fc0-000000000093 11792 1727096151.82044: variable 'ansible_search_path' from source: unknown 11792 1727096151.82050: variable 'ansible_search_path' from source: unknown 11792 1727096151.82054: variable 'lsr_cleanup' from source: include params 11792 1727096151.82312: variable 'lsr_cleanup' from source: include params 11792 1727096151.82451: variable 'omit' from source: magic vars 11792 1727096151.82655: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096151.82675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096151.82693: variable 'omit' from source: magic vars 11792 1727096151.82986: variable 'ansible_distribution_major_version' from source: facts 11792 1727096151.83016: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096151.83126: variable 'item' from source: unknown 11792 1727096151.83134: variable 'item' from source: unknown 11792 1727096151.83284: variable 'item' from source: unknown 11792 1727096151.83429: variable 'item' from source: unknown 11792 1727096151.84014: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096151.84017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096151.84020: variable 'omit' from source: magic vars 11792 1727096151.84026: variable 'ansible_distribution_major_version' from source: facts 11792 1727096151.84028: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096151.84050: variable 'item' from source: unknown 11792 1727096151.84146: variable 'item' from source: unknown 11792 1727096151.84190: variable 'item' from source: unknown 11792 1727096151.84262: variable 'item' from source: unknown 11792 1727096151.84436: dumping result to json 11792 
1727096151.84439: done dumping result, returning 11792 1727096151.84442: done running TaskExecutor() for managed_node2/TASK: Cleanup [0afff68d-5257-d9c7-3fc0-000000000093] 11792 1727096151.84444: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000093 11792 1727096151.84519: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000093 11792 1727096151.84523: WORKER PROCESS EXITING 11792 1727096151.84571: no more pending results, returning what we have 11792 1727096151.84577: in VariableManager get_vars() 11792 1727096151.84624: Calling all_inventory to load vars for managed_node2 11792 1727096151.84629: Calling groups_inventory to load vars for managed_node2 11792 1727096151.84634: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096151.84649: Calling all_plugins_play to load vars for managed_node2 11792 1727096151.84655: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096151.84659: Calling groups_plugins_play to load vars for managed_node2 11792 1727096151.86596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096151.88307: done with get_vars() 11792 1727096151.88335: variable 'ansible_search_path' from source: unknown 11792 1727096151.88336: variable 'ansible_search_path' from source: unknown 11792 1727096151.88392: variable 'ansible_search_path' from source: unknown 11792 1727096151.88393: variable 'ansible_search_path' from source: unknown 11792 1727096151.88424: we have included files to process 11792 1727096151.88425: generating all_blocks data 11792 1727096151.88428: done generating all_blocks data 11792 1727096151.88433: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11792 1727096151.88434: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11792 1727096151.88437: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11792 1727096151.88701: in VariableManager get_vars() 11792 1727096151.88723: done with get_vars() 11792 1727096151.88728: variable 'omit' from source: magic vars 11792 1727096151.88776: variable 'omit' from source: magic vars 11792 1727096151.88836: in VariableManager get_vars() 11792 1727096151.88847: done with get_vars() 11792 1727096151.88877: in VariableManager get_vars() 11792 1727096151.88896: done with get_vars() 11792 1727096151.88935: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11792 1727096151.89134: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11792 1727096151.89222: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11792 1727096151.89638: in VariableManager get_vars() 11792 1727096151.89666: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096151.91774: done processing included file 11792 1727096151.91777: iterating over new_blocks loaded from include file 11792 1727096151.91779: in VariableManager get_vars() 11792 1727096151.91888: done with get_vars() 11792 1727096151.91890: filtering new block on tags 11792 1727096151.92264: done filtering new block on tags 11792 
1727096151.92270: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node2 => (item=tasks/cleanup_bond_profile+device.yml) 11792 1727096151.92276: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11792 1727096151.92277: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11792 1727096151.92280: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11792 1727096151.92711: done processing included file 11792 1727096151.92713: iterating over new_blocks loaded from include file 11792 1727096151.92714: in VariableManager get_vars() 11792 1727096151.92730: done with get_vars() 11792 1727096151.92732: filtering new block on tags 11792 1727096151.92763: done filtering new block on tags 11792 1727096151.92765: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 11792 1727096151.92797: extending task lists for all hosts with included blocks 11792 1727096151.96301: done extending task lists 11792 1727096151.96303: done processing included files 11792 1727096151.96304: results queue empty 11792 1727096151.96305: checking for any_errors_fatal 11792 1727096151.96310: done checking for any_errors_fatal 11792 1727096151.96311: checking for max_fail_percentage 11792 1727096151.96312: done checking for max_fail_percentage 11792 1727096151.96312: checking to see if all hosts have failed and the running result is not ok 11792 1727096151.96313: done checking to see if all hosts have failed 11792 1727096151.96314: getting the remaining hosts for this loop 11792 1727096151.96315: done getting the remaining hosts for this loop 11792 1727096151.96318: getting the next task for host managed_node2 11792 1727096151.96323: done getting next task for host managed_node2 11792 1727096151.96449: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096151.96456: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11792 1727096151.96470: getting variables 11792 1727096151.96472: in VariableManager get_vars() 11792 1727096151.96492: Calling all_inventory to load vars for managed_node2 11792 1727096151.96494: Calling groups_inventory to load vars for managed_node2 11792 1727096151.96497: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096151.96502: Calling all_plugins_play to load vars for managed_node2 11792 1727096151.96505: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096151.96508: Calling groups_plugins_play to load vars for managed_node2 11792 1727096151.99338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096152.02867: done with get_vars() 11792 1727096152.02909: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:52 -0400 (0:00:00.222) 0:00:34.310 ****** 11792 1727096152.03100: entering _queue_task() for managed_node2/include_tasks 11792 1727096152.03858: worker is 1 (out of 1 available) 11792 1727096152.03874: exiting _queue_task() for managed_node2/include_tasks 11792 1727096152.03887: done queuing things up, now waiting for results queue to drain 11792 1727096152.03889: waiting for pending results... 11792 1727096152.04399: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096152.04774: in run() - task 0afff68d-5257-d9c7-3fc0-000000000693 11792 1727096152.04779: variable 'ansible_search_path' from source: unknown 11792 1727096152.04782: variable 'ansible_search_path' from source: unknown 11792 1727096152.04791: calling self._execute() 11792 1727096152.05035: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096152.05039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096152.05043: variable 'omit' from source: magic vars 11792 1727096152.05901: variable 'ansible_distribution_major_version' from source: facts 11792 1727096152.05905: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096152.05908: _execute() done 11792 1727096152.05910: dumping result to json 11792 1727096152.05912: done dumping result, returning 11792 1727096152.05914: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-d9c7-3fc0-000000000693] 11792 1727096152.05916: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000693 11792 1727096152.06085: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000693 11792 1727096152.06089: WORKER PROCESS EXITING 11792 1727096152.06143: no more pending results, returning what we have 11792 1727096152.06149: in VariableManager get_vars() 11792 1727096152.06196: Calling all_inventory to load vars for managed_node2 11792 1727096152.06200: Calling groups_inventory to load vars for managed_node2 11792 1727096152.06202: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096152.06215: Calling all_plugins_play to load vars for managed_node2 11792 1727096152.06217: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096152.06222: Calling groups_plugins_play to load vars for managed_node2 11792 
1727096152.08355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096152.09522: done with get_vars() 11792 1727096152.09544: variable 'ansible_search_path' from source: unknown 11792 1727096152.09545: variable 'ansible_search_path' from source: unknown 11792 1727096152.09591: we have included files to process 11792 1727096152.09592: generating all_blocks data 11792 1727096152.09594: done generating all_blocks data 11792 1727096152.09596: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096152.09597: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096152.09599: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096152.10162: done processing included file 11792 1727096152.10165: iterating over new_blocks loaded from include file 11792 1727096152.10166: in VariableManager get_vars() 11792 1727096152.10192: done with get_vars() 11792 1727096152.10193: filtering new block on tags 11792 1727096152.10215: done filtering new block on tags 11792 1727096152.10220: in VariableManager get_vars() 11792 1727096152.10248: done with get_vars() 11792 1727096152.10250: filtering new block on tags 11792 1727096152.10289: done filtering new block on tags 11792 1727096152.10293: in VariableManager get_vars() 11792 1727096152.10317: done with get_vars() 11792 1727096152.10319: filtering new block on tags 11792 1727096152.10348: done filtering new block on tags 11792 1727096152.10349: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11792 1727096152.10354: extending task lists for all hosts with included blocks 11792 1727096152.11929: done extending task lists 11792 1727096152.11931: done processing included files 11792 1727096152.11932: results queue empty 11792 1727096152.11933: checking for any_errors_fatal 11792 1727096152.11937: done checking for any_errors_fatal 11792 1727096152.11938: checking for max_fail_percentage 11792 1727096152.11939: done checking for max_fail_percentage 11792 1727096152.11940: checking to see if all hosts have failed and the running result is not ok 11792 1727096152.11940: done checking to see if all hosts have failed 11792 1727096152.11941: getting the remaining hosts for this loop 11792 1727096152.11943: done getting the remaining hosts for this loop 11792 1727096152.11945: getting the next task for host managed_node2 11792 1727096152.11954: done getting next task for host managed_node2 11792 1727096152.11958: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096152.11962: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096152.12004: getting variables 11792 1727096152.12006: in VariableManager get_vars() 11792 1727096152.12024: Calling all_inventory to load vars for managed_node2 11792 1727096152.12027: Calling groups_inventory to load vars for managed_node2 11792 1727096152.12029: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096152.12035: Calling all_plugins_play to load vars for managed_node2 11792 1727096152.12038: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096152.12041: Calling groups_plugins_play to load vars for managed_node2 11792 1727096152.13204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096152.15291: done with get_vars() 11792 1727096152.15318: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:55:52 -0400 (0:00:00.125) 0:00:34.435 ****** 11792 1727096152.15610: entering _queue_task() for managed_node2/setup 11792 1727096152.16190: worker is 1 (out of 1 available) 11792 1727096152.16200: exiting _queue_task() for managed_node2/setup 11792 1727096152.16211: done queuing things up, now waiting for results queue to drain 11792 1727096152.16213: waiting for pending results... 
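The "filtering new block on tags" and "extending task lists for all hosts with included blocks" messages above correspond to Ansible loading each included tasks file (for example tasks/remove_test_interfaces_with_dhcp.yml) and keeping only the blocks whose tags match the run. A minimal Python sketch of that idea, assuming a flat list of tasks and a hypothetical filter_tasks_on_tags helper; Ansible's own loader is considerably more involved:

    # Illustration only: load an included tasks file and keep the tasks whose tags
    # intersect the tags requested for the run. Not Ansible's actual implementation.
    import yaml

    def filter_tasks_on_tags(path, only_tags):
        with open(path) as f:
            tasks = yaml.safe_load(f) or []
        if not only_tags:
            return tasks                     # no --tags given: keep everything
        return [t for t in tasks if set(t.get("tags", [])) & set(only_tags)]

    # Hypothetical usage; the file name comes from the include above, the tag is made up:
    # filter_tasks_on_tags("tasks/remove_test_interfaces_with_dhcp.yml", ["cleanup"])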
11792 1727096152.16486: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096152.16546: in run() - task 0afff68d-5257-d9c7-3fc0-0000000007c9 11792 1727096152.16555: variable 'ansible_search_path' from source: unknown 11792 1727096152.16559: variable 'ansible_search_path' from source: unknown 11792 1727096152.16656: calling self._execute() 11792 1727096152.16676: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096152.16685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096152.16694: variable 'omit' from source: magic vars 11792 1727096152.17071: variable 'ansible_distribution_major_version' from source: facts 11792 1727096152.17083: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096152.17299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096152.19498: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096152.19559: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096152.19598: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096152.19633: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096152.19659: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096152.19743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096152.19773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096152.19798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096152.19840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096152.19856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096152.19909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096152.19936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096152.19959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096152.20034: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096152.20037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096152.20172: variable '__network_required_facts' from source: role '' defaults 11792 1727096152.20190: variable 'ansible_facts' from source: unknown 11792 1727096152.20943: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11792 1727096152.20947: when evaluation is False, skipping this task 11792 1727096152.20950: _execute() done 11792 1727096152.20955: dumping result to json 11792 1727096152.20958: done dumping result, returning 11792 1727096152.20961: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-d9c7-3fc0-0000000007c9] 11792 1727096152.21014: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007c9 11792 1727096152.21083: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007c9 11792 1727096152.21086: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096152.21157: no more pending results, returning what we have 11792 1727096152.21162: results queue empty 11792 1727096152.21163: checking for any_errors_fatal 11792 1727096152.21165: done checking for any_errors_fatal 11792 1727096152.21165: checking for max_fail_percentage 11792 1727096152.21169: done checking for max_fail_percentage 11792 1727096152.21170: checking to see if all hosts have failed and the running result is not ok 11792 1727096152.21171: done checking to see if all hosts have failed 11792 1727096152.21171: getting the remaining hosts for this loop 11792 1727096152.21173: done getting the remaining hosts for this loop 11792 1727096152.21177: getting the next task for host managed_node2 11792 1727096152.21188: done getting next task for host managed_node2 11792 1727096152.21192: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096152.21198: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096152.21217: getting variables 11792 1727096152.21219: in VariableManager get_vars() 11792 1727096152.21261: Calling all_inventory to load vars for managed_node2 11792 1727096152.21264: Calling groups_inventory to load vars for managed_node2 11792 1727096152.21468: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096152.21480: Calling all_plugins_play to load vars for managed_node2 11792 1727096152.21483: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096152.21493: Calling groups_plugins_play to load vars for managed_node2 11792 1727096152.22802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096152.24588: done with get_vars() 11792 1727096152.24622: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:55:52 -0400 (0:00:00.091) 0:00:34.526 ****** 11792 1727096152.24730: entering _queue_task() for managed_node2/stat 11792 1727096152.25330: worker is 1 (out of 1 available) 11792 1727096152.25348: exiting _queue_task() for managed_node2/stat 11792 1727096152.25364: done queuing things up, now waiting for results queue to drain 11792 1727096152.25366: waiting for pending results... 11792 1727096152.25996: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096152.26032: in run() - task 0afff68d-5257-d9c7-3fc0-0000000007cb 11792 1727096152.26050: variable 'ansible_search_path' from source: unknown 11792 1727096152.26056: variable 'ansible_search_path' from source: unknown 11792 1727096152.26153: calling self._execute() 11792 1727096152.26281: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096152.26285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096152.26292: variable 'omit' from source: magic vars 11792 1727096152.26718: variable 'ansible_distribution_major_version' from source: facts 11792 1727096152.26721: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096152.26974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096152.27161: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096152.27208: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096152.27241: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096152.27280: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096152.27424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096152.27429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096152.27432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096152.27532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096152.27535: variable '__network_is_ostree' from source: set_fact 11792 1727096152.27538: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096152.27541: when evaluation is False, skipping this task 11792 1727096152.27543: _execute() done 11792 1727096152.27545: dumping result to json 11792 1727096152.27548: done dumping result, returning 11792 1727096152.27640: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-d9c7-3fc0-0000000007cb] 11792 1727096152.27643: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007cb 11792 1727096152.27712: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007cb 11792 1727096152.27716: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096152.27771: no more pending results, returning what we have 11792 1727096152.27775: results queue empty 11792 1727096152.27776: checking for any_errors_fatal 11792 1727096152.27785: done checking for any_errors_fatal 11792 1727096152.27785: checking for max_fail_percentage 11792 1727096152.27787: done checking for max_fail_percentage 11792 1727096152.27788: checking to see if all hosts have failed and the running result is not ok 11792 1727096152.27789: done checking to see if all hosts have failed 11792 1727096152.27790: getting the remaining hosts for this loop 11792 1727096152.27792: done getting the remaining hosts for this loop 11792 1727096152.27796: getting the next task for host managed_node2 11792 1727096152.27804: done getting next task for host managed_node2 11792 1727096152.27807: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096152.27813: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096152.27831: getting variables 11792 1727096152.27832: in VariableManager get_vars() 11792 1727096152.27876: Calling all_inventory to load vars for managed_node2 11792 1727096152.27879: Calling groups_inventory to load vars for managed_node2 11792 1727096152.27882: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096152.27894: Calling all_plugins_play to load vars for managed_node2 11792 1727096152.27897: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096152.27899: Calling groups_plugins_play to load vars for managed_node2 11792 1727096152.30321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096152.31304: done with get_vars() 11792 1727096152.31327: done getting variables 11792 1727096152.31374: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:55:52 -0400 (0:00:00.066) 0:00:34.593 ****** 11792 1727096152.31403: entering _queue_task() for managed_node2/set_fact 11792 1727096152.31669: worker is 1 (out of 1 available) 11792 1727096152.31684: exiting _queue_task() for managed_node2/set_fact 11792 1727096152.31698: done queuing things up, now waiting for results queue to drain 11792 1727096152.31700: waiting for pending results... 
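The skipped setup task earlier ("Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False") boils down to a set difference: gather facts again only if some required fact is missing. A plain-Python sketch with placeholder values, since the real list comes from the role defaults and the facts cache:

    # Sketch of the required-facts check; both values below are assumed example data.
    __network_required_facts = ["distribution", "distribution_major_version"]
    ansible_facts = {"distribution": "CentOS", "distribution_major_version": "10"}

    missing = set(__network_required_facts) - set(ansible_facts)
    run_setup = len(missing) > 0
    print(run_setup)  # False -> the task is skipped, as reported in the log above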
11792 1727096152.31892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096152.32002: in run() - task 0afff68d-5257-d9c7-3fc0-0000000007cc 11792 1727096152.32014: variable 'ansible_search_path' from source: unknown 11792 1727096152.32018: variable 'ansible_search_path' from source: unknown 11792 1727096152.32050: calling self._execute() 11792 1727096152.32120: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096152.32124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096152.32134: variable 'omit' from source: magic vars 11792 1727096152.32427: variable 'ansible_distribution_major_version' from source: facts 11792 1727096152.32436: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096152.32554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096152.32764: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096152.32814: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096152.32916: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096152.32926: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096152.32999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096152.33022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096152.33048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096152.33079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096152.33230: variable '__network_is_ostree' from source: set_fact 11792 1727096152.33233: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096152.33236: when evaluation is False, skipping this task 11792 1727096152.33238: _execute() done 11792 1727096152.33240: dumping result to json 11792 1727096152.33242: done dumping result, returning 11792 1727096152.33244: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-d9c7-3fc0-0000000007cc] 11792 1727096152.33246: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007cc 11792 1727096152.33476: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007cc 11792 1727096152.33480: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096152.33525: no more pending results, returning what we have 11792 1727096152.33528: results queue empty 11792 1727096152.33529: checking for any_errors_fatal 11792 1727096152.33536: done checking for any_errors_fatal 11792 
1727096152.33536: checking for max_fail_percentage 11792 1727096152.33538: done checking for max_fail_percentage 11792 1727096152.33539: checking to see if all hosts have failed and the running result is not ok 11792 1727096152.33540: done checking to see if all hosts have failed 11792 1727096152.33541: getting the remaining hosts for this loop 11792 1727096152.33542: done getting the remaining hosts for this loop 11792 1727096152.33545: getting the next task for host managed_node2 11792 1727096152.33556: done getting next task for host managed_node2 11792 1727096152.33559: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096152.33564: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096152.33688: getting variables 11792 1727096152.33690: in VariableManager get_vars() 11792 1727096152.33726: Calling all_inventory to load vars for managed_node2 11792 1727096152.33729: Calling groups_inventory to load vars for managed_node2 11792 1727096152.33732: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096152.33742: Calling all_plugins_play to load vars for managed_node2 11792 1727096152.33745: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096152.33748: Calling groups_plugins_play to load vars for managed_node2 11792 1727096152.35614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096152.37505: done with get_vars() 11792 1727096152.37536: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:55:52 -0400 (0:00:00.062) 0:00:34.655 ****** 11792 1727096152.37642: entering _queue_task() for managed_node2/service_facts 11792 1727096152.38017: worker is 1 (out of 1 available) 11792 1727096152.38029: exiting _queue_task() for managed_node2/service_facts 11792 1727096152.38045: done queuing things up, now waiting for results queue to drain 11792 1727096152.38047: waiting for pending results... 
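Each task above is gated by the same when: expression, reported as "Evaluated conditional (ansible_distribution_major_version != '6'): True". Conceptually that is a Jinja2 expression rendered against the host's facts; a rough standalone sketch using plain Jinja2 (Ansible's Templar adds its own filters and lookups, and the fact value here is a made-up placeholder):

    # Evaluate a when-style expression against a facts dict. Illustration only.
    from jinja2 import Environment

    facts = {"ansible_distribution_major_version": "10"}  # placeholder value
    expr = Environment().compile_expression("ansible_distribution_major_version != '6'")
    print(bool(expr(**facts)))  # True -> the task runs instead of being skipped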
11792 1727096152.38432: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096152.38609: in run() - task 0afff68d-5257-d9c7-3fc0-0000000007ce 11792 1727096152.38886: variable 'ansible_search_path' from source: unknown 11792 1727096152.38890: variable 'ansible_search_path' from source: unknown 11792 1727096152.38929: calling self._execute() 11792 1727096152.39156: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096152.39160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096152.39164: variable 'omit' from source: magic vars 11792 1727096152.39979: variable 'ansible_distribution_major_version' from source: facts 11792 1727096152.39984: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096152.39987: variable 'omit' from source: magic vars 11792 1727096152.39989: variable 'omit' from source: magic vars 11792 1727096152.39992: variable 'omit' from source: magic vars 11792 1727096152.39995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096152.39997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096152.39999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096152.40001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096152.40003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096152.40033: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096152.40037: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096152.40039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096152.40125: Set connection var ansible_timeout to 10 11792 1727096152.40133: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096152.40143: Set connection var ansible_shell_executable to /bin/sh 11792 1727096152.40148: Set connection var ansible_pipelining to False 11792 1727096152.40151: Set connection var ansible_shell_type to sh 11792 1727096152.40153: Set connection var ansible_connection to ssh 11792 1727096152.40183: variable 'ansible_shell_executable' from source: unknown 11792 1727096152.40187: variable 'ansible_connection' from source: unknown 11792 1727096152.40190: variable 'ansible_module_compression' from source: unknown 11792 1727096152.40192: variable 'ansible_shell_type' from source: unknown 11792 1727096152.40194: variable 'ansible_shell_executable' from source: unknown 11792 1727096152.40197: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096152.40200: variable 'ansible_pipelining' from source: unknown 11792 1727096152.40202: variable 'ansible_timeout' from source: unknown 11792 1727096152.40206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096152.40450: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096152.40455: variable 'omit' from source: magic vars 11792 
1727096152.40457: starting attempt loop 11792 1727096152.40460: running the handler 11792 1727096152.40462: _low_level_execute_command(): starting 11792 1727096152.40464: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096152.41192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096152.41302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096152.41329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096152.41360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096152.43316: stdout chunk (state=3): >>>/root <<< 11792 1727096152.43320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096152.43322: stderr chunk (state=3): >>><<< 11792 1727096152.43349: stdout chunk (state=3): >>><<< 11792 1727096152.43375: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096152.43624: _low_level_execute_command(): starting 11792 1727096152.43628: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783 `" && echo ansible-tmp-1727096152.435089-13470-13125834813783="` echo /root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783 `" ) && sleep 0' 
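The _low_level_execute_command() call above stages a private, uniquely named directory under ~/.ansible/tmp on the target before the AnsiballZ_service_facts.py payload is uploaded; the exact /bin/sh command is quoted in the log. A local Python sketch of the same staging idea, with an approximated naming scheme (the real step runs over SSH and Ansible composes the name itself):

    # Create a private ansible-tmp-<timestamp>-<pid>-<random> directory, roughly
    # mirroring the mkdir command shown above. Illustration only.
    import os
    import random
    import time

    base = os.path.expanduser("~/.ansible/tmp")
    name = "ansible-tmp-%s-%d-%d" % (time.time(), os.getpid(), random.randint(0, 2**48))
    path = os.path.join(base, name)

    os.makedirs(base, mode=0o700, exist_ok=True)  # like `umask 77 && mkdir -p`
    os.mkdir(path, 0o700)                         # the per-task staging directory
    print(path)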
11792 1727096152.44794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096152.44911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096152.44960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096152.44965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096152.44998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096152.47039: stdout chunk (state=3): >>>ansible-tmp-1727096152.435089-13470-13125834813783=/root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783 <<< 11792 1727096152.47132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096152.47165: stderr chunk (state=3): >>><<< 11792 1727096152.47176: stdout chunk (state=3): >>><<< 11792 1727096152.47187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096152.435089-13470-13125834813783=/root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096152.47227: variable 'ansible_module_compression' from source: unknown 11792 1727096152.47275: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11792 1727096152.47377: variable 'ansible_facts' from source: unknown 11792 1727096152.47397: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/AnsiballZ_service_facts.py 11792 1727096152.47624: Sending initial data 11792 1727096152.47627: Sent initial data (160 bytes) 11792 1727096152.48098: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096152.48107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096152.48128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096152.48236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096152.48241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096152.48306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096152.49985: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096152.50021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096152.50052: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp17i6d01w /root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/AnsiballZ_service_facts.py <<< 11792 1727096152.50066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/AnsiballZ_service_facts.py" <<< 11792 1727096152.50089: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp17i6d01w" to remote "/root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/AnsiballZ_service_facts.py" <<< 11792 1727096152.50604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096152.50650: stderr chunk (state=3): >>><<< 11792 1727096152.50656: stdout chunk (state=3): >>><<< 11792 1727096152.50714: done transferring module to remote 11792 1727096152.50725: _low_level_execute_command(): starting 11792 1727096152.50728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/ /root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/AnsiballZ_service_facts.py && sleep 0' 11792 1727096152.51162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096152.51199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096152.51202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096152.51206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096152.51208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096152.51250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096152.51261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096152.51306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096152.53231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096152.53248: stderr chunk (state=3): >>><<< 11792 1727096152.53263: stdout chunk (state=3): >>><<< 11792 1727096152.53296: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096152.53383: _low_level_execute_command(): starting 11792 1727096152.53387: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/AnsiballZ_service_facts.py && sleep 0' 11792 1727096152.53964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096152.53982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096152.53994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096152.54038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096152.54051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096152.54116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096154.25606: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11792 1727096154.25627: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 11792 1727096154.25658: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 11792 1727096154.25662: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 11792 1727096154.25673: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11792 1727096154.27331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096154.27360: stderr chunk (state=3): >>><<< 11792 1727096154.27364: stdout chunk (state=3): >>><<< 11792 1727096154.27396: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": 
{"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": 
{"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 
originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096154.28099: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096154.28104: _low_level_execute_command(): starting 11792 1727096154.28109: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096152.435089-13470-13125834813783/ > /dev/null 2>&1 && sleep 0' 11792 1727096154.28601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.28607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096154.28609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096154.28612: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.28666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096154.28672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096154.28677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096154.28716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096154.30632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096154.30662: stderr chunk (state=3): >>><<< 11792 1727096154.30666: stdout chunk (state=3): >>><<< 11792 1727096154.30683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 
originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096154.30689: handler run complete 11792 1727096154.30807: variable 'ansible_facts' from source: unknown 11792 1727096154.30903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096154.31186: variable 'ansible_facts' from source: unknown 11792 1727096154.31272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096154.31388: attempt loop complete, returning result 11792 1727096154.31391: _execute() done 11792 1727096154.31394: dumping result to json 11792 1727096154.31433: done dumping result, returning 11792 1727096154.31441: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-d9c7-3fc0-0000000007ce] 11792 1727096154.31446: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007ce 11792 1727096154.32208: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007ce 11792 1727096154.32211: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096154.32263: no more pending results, returning what we have 11792 1727096154.32266: results queue empty 11792 1727096154.32266: checking for any_errors_fatal 11792 1727096154.32272: done checking for any_errors_fatal 11792 1727096154.32272: checking for max_fail_percentage 11792 1727096154.32273: done checking for max_fail_percentage 11792 1727096154.32274: checking to see if all hosts have failed and the running result is not ok 11792 1727096154.32274: done checking to see if all hosts have failed 11792 1727096154.32275: getting the remaining hosts for this loop 11792 1727096154.32276: done getting the remaining hosts for this loop 11792 1727096154.32278: getting the next task for host managed_node2 11792 1727096154.32282: done getting next task for host managed_node2 11792 1727096154.32285: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096154.32289: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096154.32297: getting variables 11792 1727096154.32299: in VariableManager get_vars() 11792 1727096154.32320: Calling all_inventory to load vars for managed_node2 11792 1727096154.32322: Calling groups_inventory to load vars for managed_node2 11792 1727096154.32323: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096154.32330: Calling all_plugins_play to load vars for managed_node2 11792 1727096154.32331: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096154.32337: Calling groups_plugins_play to load vars for managed_node2 11792 1727096154.33016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096154.33911: done with get_vars() 11792 1727096154.33937: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:55:54 -0400 (0:00:01.963) 0:00:36.619 ****** 11792 1727096154.34015: entering _queue_task() for managed_node2/package_facts 11792 1727096154.34297: worker is 1 (out of 1 available) 11792 1727096154.34313: exiting _queue_task() for managed_node2/package_facts 11792 1727096154.34327: done queuing things up, now waiting for results queue to drain 11792 1727096154.34328: waiting for pending results... 
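For reference, the tasks driving this stretch of the trace live in the role's set_facts.yml (the task path is printed in the banner above). A minimal sketch of what they plausibly look like follows; the task names, the service_facts and package_facts modules, the no_log censoring on the service scan, and the ansible_distribution_major_version != '6' conditional are all visible in the log, while the exact YAML below is an illustrative reconstruction, not the role's verbatim source:

- name: Check which services are running
  service_facts:
  no_log: true   # matches the "censored" result printed for the previous task
  when: ansible_distribution_major_version != '6'

- name: Check which packages are installed
  package_facts:
  when: ansible_distribution_major_version != '6'   # evaluated True in this run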
11792 1727096154.34520: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096154.34638: in run() - task 0afff68d-5257-d9c7-3fc0-0000000007cf 11792 1727096154.34651: variable 'ansible_search_path' from source: unknown 11792 1727096154.34657: variable 'ansible_search_path' from source: unknown 11792 1727096154.34690: calling self._execute() 11792 1727096154.34763: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096154.34772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096154.34784: variable 'omit' from source: magic vars 11792 1727096154.35072: variable 'ansible_distribution_major_version' from source: facts 11792 1727096154.35081: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096154.35088: variable 'omit' from source: magic vars 11792 1727096154.35155: variable 'omit' from source: magic vars 11792 1727096154.35179: variable 'omit' from source: magic vars 11792 1727096154.35217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096154.35242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096154.35259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096154.35275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096154.35284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096154.35307: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096154.35310: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096154.35312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096154.35386: Set connection var ansible_timeout to 10 11792 1727096154.35398: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096154.35493: Set connection var ansible_shell_executable to /bin/sh 11792 1727096154.35497: Set connection var ansible_pipelining to False 11792 1727096154.35499: Set connection var ansible_shell_type to sh 11792 1727096154.35501: Set connection var ansible_connection to ssh 11792 1727096154.35503: variable 'ansible_shell_executable' from source: unknown 11792 1727096154.35505: variable 'ansible_connection' from source: unknown 11792 1727096154.35508: variable 'ansible_module_compression' from source: unknown 11792 1727096154.35510: variable 'ansible_shell_type' from source: unknown 11792 1727096154.35512: variable 'ansible_shell_executable' from source: unknown 11792 1727096154.35513: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096154.35515: variable 'ansible_pipelining' from source: unknown 11792 1727096154.35517: variable 'ansible_timeout' from source: unknown 11792 1727096154.35519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096154.35874: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096154.35880: variable 'omit' from source: magic vars 11792 
1727096154.35882: starting attempt loop 11792 1727096154.35885: running the handler 11792 1727096154.35887: _low_level_execute_command(): starting 11792 1727096154.35889: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096154.36351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096154.36362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096154.36382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.36398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096154.36411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096154.36420: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096154.36430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.36446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096154.36456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096154.36459: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096154.36471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096154.36483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.36495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096154.36503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096154.36511: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096154.36521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.36590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096154.36605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096154.36651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096154.38374: stdout chunk (state=3): >>>/root <<< 11792 1727096154.38466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096154.38505: stderr chunk (state=3): >>><<< 11792 1727096154.38508: stdout chunk (state=3): >>><<< 11792 1727096154.38530: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096154.38542: _low_level_execute_command(): starting 11792 1727096154.38549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850 `" && echo ansible-tmp-1727096154.3853006-13560-142617367435850="` echo /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850 `" ) && sleep 0' 11792 1727096154.39102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096154.39106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.39109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096154.39111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.39160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096154.39164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096154.39188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096154.39215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096154.41255: stdout chunk (state=3): >>>ansible-tmp-1727096154.3853006-13560-142617367435850=/root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850 <<< 11792 1727096154.41356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096154.41386: stderr chunk (state=3): >>><<< 11792 1727096154.41389: stdout chunk (state=3): >>><<< 11792 1727096154.41405: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096154.3853006-13560-142617367435850=/root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096154.41449: variable 'ansible_module_compression' from source: unknown 11792 1727096154.41493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11792 1727096154.41546: variable 'ansible_facts' from source: unknown 11792 1727096154.41667: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/AnsiballZ_package_facts.py 11792 1727096154.41788: Sending initial data 11792 1727096154.41791: Sent initial data (162 bytes) 11792 1727096154.42251: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.42255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.42258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.42260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.42311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096154.42314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096154.42317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096154.42365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096154.44037: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096154.44057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096154.44091: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpr_cft1hn /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/AnsiballZ_package_facts.py <<< 11792 1727096154.44104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/AnsiballZ_package_facts.py" <<< 11792 1727096154.44123: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpr_cft1hn" to remote "/root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/AnsiballZ_package_facts.py" <<< 11792 1727096154.44129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/AnsiballZ_package_facts.py" <<< 11792 1727096154.45155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096154.45199: stderr chunk (state=3): >>><<< 11792 1727096154.45203: stdout chunk (state=3): >>><<< 11792 1727096154.45239: done transferring module to remote 11792 1727096154.45249: _low_level_execute_command(): starting 11792 1727096154.45256: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/ /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/AnsiballZ_package_facts.py && sleep 0' 11792 1727096154.45730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.45738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.45741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.45743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.45791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096154.45794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096154.45796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096154.45840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096154.47747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096154.47779: stderr chunk (state=3): >>><<< 11792 1727096154.47782: stdout chunk (state=3): >>><<< 11792 1727096154.47798: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096154.47801: _low_level_execute_command(): starting 11792 1727096154.47806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/AnsiballZ_package_facts.py && sleep 0' 11792 1727096154.48255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096154.48260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.48286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096154.48289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096154.48354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096154.48362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096154.48403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096154.94076: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", 
"version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11792 1727096154.94205: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", 
"version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", 
"version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11792 
1727096154.94235: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": 
"python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11792 1727096154.96445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096154.96449: stdout chunk (state=3): >>><<< 11792 1727096154.96455: stderr chunk (state=3): >>><<< 11792 1727096154.96470: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096155.00572: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096155.00640: _low_level_execute_command(): starting 11792 1727096155.00644: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096154.3853006-13560-142617367435850/ > /dev/null 2>&1 && sleep 0' 11792 1727096155.01501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096155.01506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096155.01572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096155.01586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096155.01649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096155.03775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096155.03781: stdout chunk (state=3): >>><<< 11792 1727096155.03784: stderr chunk (state=3): >>><<< 11792 1727096155.03787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096155.03791: handler run complete 11792 1727096155.05520: variable 'ansible_facts' from source: unknown 11792 1727096155.07154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.11363: variable 'ansible_facts' from source: unknown 11792 1727096155.11975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.12774: attempt loop complete, returning result 11792 1727096155.12778: _execute() done 11792 1727096155.12780: dumping result to json 11792 1727096155.12871: done dumping result, returning 11792 1727096155.12881: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-d9c7-3fc0-0000000007cf] 11792 1727096155.12886: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007cf ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096155.15749: no more pending results, returning what we have 11792 1727096155.15752: results queue empty 11792 1727096155.15753: checking for any_errors_fatal 11792 1727096155.15757: done checking for any_errors_fatal 11792 1727096155.15758: checking for max_fail_percentage 11792 1727096155.15760: done checking for max_fail_percentage 11792 1727096155.15760: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.15761: done checking to see if all hosts have failed 11792 1727096155.15762: getting the remaining hosts for this loop 11792 1727096155.15763: done getting the remaining hosts for this loop 11792 1727096155.15766: getting the next task for host managed_node2 11792 1727096155.15775: done getting next task for host managed_node2 11792 1727096155.15778: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096155.15783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096155.15794: getting variables 11792 1727096155.15796: in VariableManager get_vars() 11792 1727096155.15825: Calling all_inventory to load vars for managed_node2 11792 1727096155.15829: Calling groups_inventory to load vars for managed_node2 11792 1727096155.15831: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.15840: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.15843: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.15846: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.16384: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000007cf 11792 1727096155.16389: WORKER PROCESS EXITING 11792 1727096155.17840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.19540: done with get_vars() 11792 1727096155.19584: done getting variables 11792 1727096155.19646: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:55 -0400 (0:00:00.856) 0:00:37.476 ****** 11792 1727096155.19700: entering _queue_task() for managed_node2/debug 11792 1727096155.20086: worker is 1 (out of 1 available) 11792 1727096155.20099: exiting _queue_task() for managed_node2/debug 11792 1727096155.20231: done queuing things up, now waiting for results queue to drain 11792 1727096155.20234: waiting for pending results... 
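[editor's note] For reference, the package_facts run that completed above populates ansible_facts.packages as a dict keyed by package name, each value a list of {name, version, release, epoch, arch, source} entries; the task result itself is censored because no_log was set. The "Print network provider" task queued here comes from roles/network/tasks/main.yml:7. A minimal sketch of what such a task could look like, assuming a plain debug task; the message text matches the result printed further below, but the exact YAML in the role may differ:

  - name: Print network provider
    debug:
      msg: "Using network provider: {{ network_provider }}"  # network_provider comes from an earlier set_fact, per this log

The entries that follow show this task evaluating its distribution-version conditional, loading the ssh connection and sh shell plugins, and returning "Using network provider: nm".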
11792 1727096155.20441: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096155.20610: in run() - task 0afff68d-5257-d9c7-3fc0-000000000694 11792 1727096155.20635: variable 'ansible_search_path' from source: unknown 11792 1727096155.20645: variable 'ansible_search_path' from source: unknown 11792 1727096155.20698: calling self._execute() 11792 1727096155.20806: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.20820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.20836: variable 'omit' from source: magic vars 11792 1727096155.21257: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.21279: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096155.21291: variable 'omit' from source: magic vars 11792 1727096155.21384: variable 'omit' from source: magic vars 11792 1727096155.21501: variable 'network_provider' from source: set_fact 11792 1727096155.21526: variable 'omit' from source: magic vars 11792 1727096155.21581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096155.21618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096155.21648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096155.21673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096155.21689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096155.21724: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096155.21759: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.21766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.21870: Set connection var ansible_timeout to 10 11792 1727096155.21976: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096155.21979: Set connection var ansible_shell_executable to /bin/sh 11792 1727096155.21981: Set connection var ansible_pipelining to False 11792 1727096155.21983: Set connection var ansible_shell_type to sh 11792 1727096155.21985: Set connection var ansible_connection to ssh 11792 1727096155.21987: variable 'ansible_shell_executable' from source: unknown 11792 1727096155.21989: variable 'ansible_connection' from source: unknown 11792 1727096155.21990: variable 'ansible_module_compression' from source: unknown 11792 1727096155.21992: variable 'ansible_shell_type' from source: unknown 11792 1727096155.21995: variable 'ansible_shell_executable' from source: unknown 11792 1727096155.21996: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.21998: variable 'ansible_pipelining' from source: unknown 11792 1727096155.22000: variable 'ansible_timeout' from source: unknown 11792 1727096155.22002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.22119: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11792 1727096155.22194: variable 'omit' from source: magic vars 11792 1727096155.22197: starting attempt loop 11792 1727096155.22200: running the handler 11792 1727096155.22207: handler run complete 11792 1727096155.22232: attempt loop complete, returning result 11792 1727096155.22240: _execute() done 11792 1727096155.22248: dumping result to json 11792 1727096155.22255: done dumping result, returning 11792 1727096155.22271: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-d9c7-3fc0-000000000694] 11792 1727096155.22280: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000694 ok: [managed_node2] => {} MSG: Using network provider: nm 11792 1727096155.22529: no more pending results, returning what we have 11792 1727096155.22533: results queue empty 11792 1727096155.22534: checking for any_errors_fatal 11792 1727096155.22544: done checking for any_errors_fatal 11792 1727096155.22545: checking for max_fail_percentage 11792 1727096155.22547: done checking for max_fail_percentage 11792 1727096155.22548: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.22549: done checking to see if all hosts have failed 11792 1727096155.22550: getting the remaining hosts for this loop 11792 1727096155.22552: done getting the remaining hosts for this loop 11792 1727096155.22555: getting the next task for host managed_node2 11792 1727096155.22564: done getting next task for host managed_node2 11792 1727096155.22570: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096155.22576: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096155.22590: getting variables 11792 1727096155.22591: in VariableManager get_vars() 11792 1727096155.22750: Calling all_inventory to load vars for managed_node2 11792 1727096155.22753: Calling groups_inventory to load vars for managed_node2 11792 1727096155.22756: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.22766: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.22771: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.22774: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.23360: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000694 11792 1727096155.23364: WORKER PROCESS EXITING 11792 1727096155.24496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.26730: done with get_vars() 11792 1727096155.26766: done getting variables 11792 1727096155.26936: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:55 -0400 (0:00:00.072) 0:00:37.549 ****** 11792 1727096155.27072: entering _queue_task() for managed_node2/fail 11792 1727096155.27678: worker is 1 (out of 1 available) 11792 1727096155.27690: exiting _queue_task() for managed_node2/fail 11792 1727096155.27702: done queuing things up, now waiting for results queue to drain 11792 1727096155.27704: waiting for pending results... 
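The "ok" result above ("Using network provider: nm") is produced by a plain debug task in the role. A minimal sketch of what that task likely looks like (the exact task text in roles/network/tasks/main.yml may differ; network_provider is the fact set earlier in the run):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"

In verbose output a debug task renders as "ok: [host] => {}" followed by an MSG block, which is exactly what appears above.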
11792 1727096155.27972: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096155.28147: in run() - task 0afff68d-5257-d9c7-3fc0-000000000695 11792 1727096155.28174: variable 'ansible_search_path' from source: unknown 11792 1727096155.28188: variable 'ansible_search_path' from source: unknown 11792 1727096155.28235: calling self._execute() 11792 1727096155.28344: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.28359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.28410: variable 'omit' from source: magic vars 11792 1727096155.28787: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.28806: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096155.28941: variable 'network_state' from source: role '' defaults 11792 1727096155.29063: Evaluated conditional (network_state != {}): False 11792 1727096155.29069: when evaluation is False, skipping this task 11792 1727096155.29072: _execute() done 11792 1727096155.29075: dumping result to json 11792 1727096155.29078: done dumping result, returning 11792 1727096155.29081: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-d9c7-3fc0-000000000695] 11792 1727096155.29084: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000695 11792 1727096155.29162: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000695 11792 1727096155.29277: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096155.29330: no more pending results, returning what we have 11792 1727096155.29334: results queue empty 11792 1727096155.29335: checking for any_errors_fatal 11792 1727096155.29346: done checking for any_errors_fatal 11792 1727096155.29347: checking for max_fail_percentage 11792 1727096155.29348: done checking for max_fail_percentage 11792 1727096155.29350: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.29351: done checking to see if all hosts have failed 11792 1727096155.29352: getting the remaining hosts for this loop 11792 1727096155.29353: done getting the remaining hosts for this loop 11792 1727096155.29357: getting the next task for host managed_node2 11792 1727096155.29366: done getting next task for host managed_node2 11792 1727096155.29371: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096155.29381: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096155.29400: getting variables 11792 1727096155.29402: in VariableManager get_vars() 11792 1727096155.29444: Calling all_inventory to load vars for managed_node2 11792 1727096155.29448: Calling groups_inventory to load vars for managed_node2 11792 1727096155.29450: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.29463: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.29466: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.29574: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.31275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.32997: done with get_vars() 11792 1727096155.33033: done getting variables 11792 1727096155.33095: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:55 -0400 (0:00:00.061) 0:00:37.610 ****** 11792 1727096155.33134: entering _queue_task() for managed_node2/fail 11792 1727096155.33506: worker is 1 (out of 1 available) 11792 1727096155.33518: exiting _queue_task() for managed_node2/fail 11792 1727096155.33532: done queuing things up, now waiting for results queue to drain 11792 1727096155.33534: waiting for pending results... 
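The task skipped above is a guard built on the fail module; its result shows the short-circuited condition ("network_state != {}"). A hedged sketch of its likely shape, reconstructed from the task name and that condition (the message text and the second condition are assumptions, not taken from the log):

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}
        - network_provider == "initscripts"  # assumed second guard; only network_state != {} appears in the log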
11792 1727096155.33842: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096155.34175: in run() - task 0afff68d-5257-d9c7-3fc0-000000000696 11792 1727096155.34179: variable 'ansible_search_path' from source: unknown 11792 1727096155.34183: variable 'ansible_search_path' from source: unknown 11792 1727096155.34187: calling self._execute() 11792 1727096155.34189: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.34198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.34212: variable 'omit' from source: magic vars 11792 1727096155.34592: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.34661: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096155.34844: variable 'network_state' from source: role '' defaults 11792 1727096155.35070: Evaluated conditional (network_state != {}): False 11792 1727096155.35074: when evaluation is False, skipping this task 11792 1727096155.35077: _execute() done 11792 1727096155.35079: dumping result to json 11792 1727096155.35081: done dumping result, returning 11792 1727096155.35083: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-d9c7-3fc0-000000000696] 11792 1727096155.35085: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000696 11792 1727096155.35160: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000696 11792 1727096155.35164: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096155.35224: no more pending results, returning what we have 11792 1727096155.35228: results queue empty 11792 1727096155.35229: checking for any_errors_fatal 11792 1727096155.35236: done checking for any_errors_fatal 11792 1727096155.35237: checking for max_fail_percentage 11792 1727096155.35239: done checking for max_fail_percentage 11792 1727096155.35240: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.35241: done checking to see if all hosts have failed 11792 1727096155.35242: getting the remaining hosts for this loop 11792 1727096155.35243: done getting the remaining hosts for this loop 11792 1727096155.35247: getting the next task for host managed_node2 11792 1727096155.35256: done getting next task for host managed_node2 11792 1727096155.35260: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096155.35265: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096155.35288: getting variables 11792 1727096155.35290: in VariableManager get_vars() 11792 1727096155.35330: Calling all_inventory to load vars for managed_node2 11792 1727096155.35333: Calling groups_inventory to load vars for managed_node2 11792 1727096155.35335: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.35348: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.35351: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.35354: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.37450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.39419: done with get_vars() 11792 1727096155.39456: done getting variables 11792 1727096155.39520: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:55 -0400 (0:00:00.064) 0:00:37.674 ****** 11792 1727096155.39558: entering _queue_task() for managed_node2/fail 11792 1727096155.40335: worker is 1 (out of 1 available) 11792 1727096155.40349: exiting _queue_task() for managed_node2/fail 11792 1727096155.40363: done queuing things up, now waiting for results queue to drain 11792 1727096155.40364: waiting for pending results... 
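The next guard is skipped for the same reason: network_state is still the role default ({}), so the when clause never reaches the version check. A hedged sketch of the task, with everything beyond the logged condition marked as an assumption:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying the network state configuration requires version 8 or later  # assumed wording
      when:
        - network_state != {}
        - ansible_distribution_major_version | int < 8  # assumed second guard; only network_state != {} appears in the log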
11792 1727096155.40839: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096155.40912: in run() - task 0afff68d-5257-d9c7-3fc0-000000000697 11792 1727096155.41040: variable 'ansible_search_path' from source: unknown 11792 1727096155.41043: variable 'ansible_search_path' from source: unknown 11792 1727096155.41047: calling self._execute() 11792 1727096155.41089: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.41103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.41119: variable 'omit' from source: magic vars 11792 1727096155.41504: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.41521: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096155.41705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096155.46345: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096155.46580: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096155.46637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096155.46724: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096155.46841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096155.46966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.47043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.47143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.47193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.47236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.47492: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.47514: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11792 1727096155.47819: variable 'ansible_distribution' from source: facts 11792 1727096155.47822: variable '__network_rh_distros' from source: role '' defaults 11792 1727096155.47825: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11792 1727096155.48363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.48474: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.48477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.48520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.48603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.48655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.48737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.48766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.48821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.48841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.48901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.48939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.48973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.49023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.49042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.49389: variable 'network_connections' from source: task vars 11792 1727096155.49406: variable 'port2_profile' from source: play vars 11792 1727096155.49486: variable 'port2_profile' from source: play vars 11792 1727096155.49502: variable 'port1_profile' from source: play vars 11792 1727096155.49572: variable 'port1_profile' from source: play vars 11792 1727096155.49676: variable 'controller_profile' from source: play vars 
11792 1727096155.49679: variable 'controller_profile' from source: play vars 11792 1727096155.49681: variable 'network_state' from source: role '' defaults 11792 1727096155.49729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096155.49925: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096155.49972: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096155.50014: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096155.50045: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096155.50110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096155.50136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096155.50165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.50197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096155.50240: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11792 1727096155.50248: when evaluation is False, skipping this task 11792 1727096155.50256: _execute() done 11792 1727096155.50264: dumping result to json 11792 1727096155.50274: done dumping result, returning 11792 1727096155.50327: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-d9c7-3fc0-000000000697] 11792 1727096155.50331: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000697 11792 1727096155.50652: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000697 11792 1727096155.50656: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11792 1727096155.50708: no more pending results, returning what we have 11792 1727096155.50712: results queue empty 11792 1727096155.50713: checking for any_errors_fatal 11792 1727096155.50722: done checking for any_errors_fatal 11792 1727096155.50722: checking for max_fail_percentage 11792 1727096155.50724: done checking for max_fail_percentage 11792 1727096155.50725: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.50726: done checking to see if all hosts have failed 11792 
1727096155.50726: getting the remaining hosts for this loop 11792 1727096155.50728: done getting the remaining hosts for this loop 11792 1727096155.50732: getting the next task for host managed_node2 11792 1727096155.50741: done getting next task for host managed_node2 11792 1727096155.50746: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096155.50751: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096155.50872: getting variables 11792 1727096155.50875: in VariableManager get_vars() 11792 1727096155.50921: Calling all_inventory to load vars for managed_node2 11792 1727096155.50924: Calling groups_inventory to load vars for managed_node2 11792 1727096155.50927: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.50939: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.50942: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.50945: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.53442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.55717: done with get_vars() 11792 1727096155.55751: done getting variables 11792 1727096155.55888: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:55 -0400 (0:00:00.163) 0:00:37.838 ****** 11792 1727096155.55927: entering _queue_task() for managed_node2/dnf 11792 1727096155.56779: worker is 1 (out of 1 available) 11792 1727096155.57081: exiting _queue_task() for managed_node2/dnf 11792 1727096155.57094: done queuing things up, now waiting for results queue to drain 11792 1727096155.57096: waiting for pending results... 
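The teaming guard above evaluates three conditions that all appear in the log: the distribution major version is greater than 9, the distribution is in __network_rh_distros, and the network_connections/network_state lists contain a profile of type "team". Only the last one is False, so the task is skipped. A sketch of the task assembled from those logged conditions (the fail message is an assumption):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later  # assumed wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0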
11792 1727096155.57617: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096155.57771: in run() - task 0afff68d-5257-d9c7-3fc0-000000000698 11792 1727096155.57838: variable 'ansible_search_path' from source: unknown 11792 1727096155.57845: variable 'ansible_search_path' from source: unknown 11792 1727096155.57870: calling self._execute() 11792 1727096155.58053: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.58058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.58060: variable 'omit' from source: magic vars 11792 1727096155.58386: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.58405: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096155.58615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096155.62863: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096155.62988: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096155.63049: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096155.63092: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096155.63145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096155.63221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.63363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.63366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.63371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.63373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.63488: variable 'ansible_distribution' from source: facts 11792 1727096155.63497: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.63519: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11792 1727096155.63644: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096155.63794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.63821: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.63850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.63903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.63923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.63967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.63998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.64034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.64079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.64097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.64145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.64174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.64226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.64249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.64266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.64431: variable 'network_connections' from source: task vars 11792 1727096155.64551: variable 'port2_profile' from source: play vars 11792 1727096155.64554: variable 'port2_profile' from source: play vars 11792 1727096155.64556: variable 'port1_profile' from source: play vars 11792 1727096155.64608: variable 'port1_profile' from source: play vars 11792 1727096155.64619: variable 'controller_profile' from source: play vars 
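The conditions driving the next few skips come from two role defaults, __network_wireless_connections_defined and __network_team_connections_defined. Judging by the selectattr chain evaluated for the teaming guard above, they are most likely defined along these lines in the role's defaults (this is an assumption; the wireless pattern in particular is inferred by analogy and not shown in the log):

    __network_wireless_connections_defined: "{{ network_connections
      | selectattr('type', 'defined') | selectattr('type', 'match', '^wireless$')
      | list | length > 0 }}"
    __network_team_connections_defined: "{{ network_connections
      | selectattr('type', 'defined') | selectattr('type', 'match', '^team$')
      | list | length > 0 }}"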
11792 1727096155.64687: variable 'controller_profile' from source: play vars 11792 1727096155.64890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096155.65418: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096155.65422: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096155.65425: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096155.65496: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096155.65618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096155.65976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096155.65979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.65982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096155.65984: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096155.66531: variable 'network_connections' from source: task vars 11792 1727096155.66559: variable 'port2_profile' from source: play vars 11792 1727096155.66675: variable 'port2_profile' from source: play vars 11792 1727096155.66689: variable 'port1_profile' from source: play vars 11792 1727096155.66759: variable 'port1_profile' from source: play vars 11792 1727096155.66775: variable 'controller_profile' from source: play vars 11792 1727096155.66835: variable 'controller_profile' from source: play vars 11792 1727096155.66874: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096155.66882: when evaluation is False, skipping this task 11792 1727096155.66888: _execute() done 11792 1727096155.66906: dumping result to json 11792 1727096155.66913: done dumping result, returning 11792 1727096155.66926: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000698] 11792 1727096155.66935: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000698 11792 1727096155.67061: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000698 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096155.67117: no more pending results, returning what we have 11792 1727096155.67121: results queue empty 11792 1727096155.67122: checking for any_errors_fatal 11792 1727096155.67132: done checking for any_errors_fatal 11792 1727096155.67133: checking for max_fail_percentage 11792 1727096155.67135: done checking for max_fail_percentage 11792 
1727096155.67136: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.67137: done checking to see if all hosts have failed 11792 1727096155.67137: getting the remaining hosts for this loop 11792 1727096155.67139: done getting the remaining hosts for this loop 11792 1727096155.67143: getting the next task for host managed_node2 11792 1727096155.67152: done getting next task for host managed_node2 11792 1727096155.67156: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096155.67161: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096155.67185: getting variables 11792 1727096155.67187: in VariableManager get_vars() 11792 1727096155.67226: Calling all_inventory to load vars for managed_node2 11792 1727096155.67229: Calling groups_inventory to load vars for managed_node2 11792 1727096155.67231: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.67242: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.67245: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.67248: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.67974: WORKER PROCESS EXITING 11792 1727096155.69137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.70756: done with get_vars() 11792 1727096155.70794: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096155.70878: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:55 -0400 (0:00:00.149) 0:00:37.988 ****** 11792 1727096155.70914: entering _queue_task() for managed_node2/yum 11792 1727096155.71494: worker is 1 (out of 1 available) 11792 1727096155.71504: exiting _queue_task() for managed_node2/yum 11792 1727096155.71516: done queuing things up, now waiting for results queue to drain 11792 1727096155.71518: waiting for pending results... 
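The DNF update check that was just skipped is guarded by the wireless/team condition shown in its result. A hedged sketch of the task; the module arguments, the check_mode flag, and the network_packages variable name are assumptions, and only the when condition is taken from the log:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # hypothetical variable name for the role's package list
        state: latest
      check_mode: true  # assumed: only report whether updates exist, do not install
      when: __network_wireless_connections_defined or __network_team_connections_defined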
11792 1727096155.71622: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096155.71795: in run() - task 0afff68d-5257-d9c7-3fc0-000000000699 11792 1727096155.71819: variable 'ansible_search_path' from source: unknown 11792 1727096155.71826: variable 'ansible_search_path' from source: unknown 11792 1727096155.71875: calling self._execute() 11792 1727096155.71977: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.71990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.72006: variable 'omit' from source: magic vars 11792 1727096155.72406: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.72424: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096155.72613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096155.75059: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096155.75151: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096155.75194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096155.75239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096155.75275: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096155.75364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.75399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.75435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.75487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.75548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.75622: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.75644: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11792 1727096155.75659: when evaluation is False, skipping this task 11792 1727096155.75672: _execute() done 11792 1727096155.75680: dumping result to json 11792 1727096155.75763: done dumping result, returning 11792 1727096155.75766: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000699] 11792 
1727096155.75776: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000699 11792 1727096155.75850: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000699 11792 1727096155.75852: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11792 1727096155.75935: no more pending results, returning what we have 11792 1727096155.75940: results queue empty 11792 1727096155.75940: checking for any_errors_fatal 11792 1727096155.75949: done checking for any_errors_fatal 11792 1727096155.75950: checking for max_fail_percentage 11792 1727096155.75952: done checking for max_fail_percentage 11792 1727096155.75953: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.75953: done checking to see if all hosts have failed 11792 1727096155.75954: getting the remaining hosts for this loop 11792 1727096155.75956: done getting the remaining hosts for this loop 11792 1727096155.75960: getting the next task for host managed_node2 11792 1727096155.75971: done getting next task for host managed_node2 11792 1727096155.75976: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096155.75983: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096155.76000: getting variables 11792 1727096155.76002: in VariableManager get_vars() 11792 1727096155.76047: Calling all_inventory to load vars for managed_node2 11792 1727096155.76051: Calling groups_inventory to load vars for managed_node2 11792 1727096155.76053: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.76065: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.76172: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.76181: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.77990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096155.87788: done with get_vars() 11792 1727096155.87820: done getting variables 11792 1727096155.87981: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:55 -0400 (0:00:00.170) 0:00:38.159 ****** 11792 1727096155.88012: entering _queue_task() for managed_node2/fail 11792 1727096155.88772: worker is 1 (out of 1 available) 11792 1727096155.88787: exiting _queue_task() for managed_node2/fail 11792 1727096155.88806: done queuing things up, now waiting for results queue to drain 11792 1727096155.88808: waiting for pending results... 
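The YUM variant of the same check is skipped because ansible_distribution_major_version | int < 8 is False on this host; note that the log also shows ansible.builtin.yum being redirected to ansible.builtin.dnf on ansible-core 2.17. A hedged sketch, with the same caveats as the DNF task above (only the first when condition is taken from the log):

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"  # hypothetical variable name for the role's package list
        state: latest
      check_mode: true  # assumed: only report whether updates exist, do not install
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined  # assumed, by analogy with the DNF task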
11792 1727096155.89472: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096155.89708: in run() - task 0afff68d-5257-d9c7-3fc0-00000000069a 11792 1727096155.89723: variable 'ansible_search_path' from source: unknown 11792 1727096155.89783: variable 'ansible_search_path' from source: unknown 11792 1727096155.89815: calling self._execute() 11792 1727096155.90028: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096155.90074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096155.90077: variable 'omit' from source: magic vars 11792 1727096155.90903: variable 'ansible_distribution_major_version' from source: facts 11792 1727096155.90914: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096155.91207: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096155.91640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096155.94278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096155.94368: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096155.94401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096155.94475: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096155.94481: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096155.94600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.94605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.94626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.94680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.94698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.94804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.94808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.94815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.94870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.94890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.94936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096155.95025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096155.95032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.95035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096155.95057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096155.95474: variable 'network_connections' from source: task vars 11792 1727096155.95477: variable 'port2_profile' from source: play vars 11792 1727096155.95480: variable 'port2_profile' from source: play vars 11792 1727096155.95482: variable 'port1_profile' from source: play vars 11792 1727096155.95588: variable 'port1_profile' from source: play vars 11792 1727096155.95596: variable 'controller_profile' from source: play vars 11792 1727096155.95651: variable 'controller_profile' from source: play vars 11792 1727096155.95835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096155.96263: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096155.96300: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096155.96380: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096155.96409: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096155.96579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096155.96601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096155.96626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096155.96672: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096155.96726: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096155.97075: variable 'network_connections' from source: task vars 11792 1727096155.97080: variable 'port2_profile' from source: play vars 11792 1727096155.97161: variable 'port2_profile' from source: play vars 11792 1727096155.97170: variable 'port1_profile' from source: play vars 11792 1727096155.97240: variable 'port1_profile' from source: play vars 11792 1727096155.97243: variable 'controller_profile' from source: play vars 11792 1727096155.97302: variable 'controller_profile' from source: play vars 11792 1727096155.97349: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096155.97360: when evaluation is False, skipping this task 11792 1727096155.97363: _execute() done 11792 1727096155.97366: dumping result to json 11792 1727096155.97370: done dumping result, returning 11792 1727096155.97373: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-00000000069a] 11792 1727096155.97375: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069a 11792 1727096155.97520: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069a 11792 1727096155.97523: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096155.97610: no more pending results, returning what we have 11792 1727096155.97613: results queue empty 11792 1727096155.97614: checking for any_errors_fatal 11792 1727096155.97620: done checking for any_errors_fatal 11792 1727096155.97621: checking for max_fail_percentage 11792 1727096155.97622: done checking for max_fail_percentage 11792 1727096155.97623: checking to see if all hosts have failed and the running result is not ok 11792 1727096155.97624: done checking to see if all hosts have failed 11792 1727096155.97624: getting the remaining hosts for this loop 11792 1727096155.97626: done getting the remaining hosts for this loop 11792 1727096155.97629: getting the next task for host managed_node2 11792 1727096155.97637: done getting next task for host managed_node2 11792 1727096155.97641: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11792 1727096155.97645: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096155.97660: getting variables 11792 1727096155.97661: in VariableManager get_vars() 11792 1727096155.97822: Calling all_inventory to load vars for managed_node2 11792 1727096155.97825: Calling groups_inventory to load vars for managed_node2 11792 1727096155.97827: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096155.97837: Calling all_plugins_play to load vars for managed_node2 11792 1727096155.97840: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096155.97842: Calling groups_plugins_play to load vars for managed_node2 11792 1727096155.99249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096156.01702: done with get_vars() 11792 1727096156.01746: done getting variables 11792 1727096156.01811: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:56 -0400 (0:00:00.138) 0:00:38.297 ****** 11792 1727096156.01859: entering _queue_task() for managed_node2/package 11792 1727096156.02256: worker is 1 (out of 1 available) 11792 1727096156.02381: exiting _queue_task() for managed_node2/package 11792 1727096156.02395: done queuing things up, now waiting for results queue to drain 11792 1727096156.02397: waiting for pending results... 
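The task just completed, "Ask user's consent to restart NetworkManager due to wireless or team interfaces", was skipped because neither wireless nor team connections are defined (false_condition: __network_wireless_connections_defined or __network_team_connections_defined). Purely as an illustration of that guard, and not the role's actual source, such a prompt could be written along these lines (the pause module and the prompt wording are assumptions):

    # Hypothetical sketch, not taken from fedora.linux_system_roles.network
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.pause:
        prompt: "NetworkManager will be restarted to apply wireless/team profiles - press Enter to continue"  # assumed wording
      when: __network_wireless_connections_defined or __network_team_connections_defined
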
11792 1727096156.02686: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11792 1727096156.02974: in run() - task 0afff68d-5257-d9c7-3fc0-00000000069b 11792 1727096156.02979: variable 'ansible_search_path' from source: unknown 11792 1727096156.02982: variable 'ansible_search_path' from source: unknown 11792 1727096156.02985: calling self._execute() 11792 1727096156.02988: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096156.02991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096156.02994: variable 'omit' from source: magic vars 11792 1727096156.03474: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.03479: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096156.03775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096156.03916: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096156.03966: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096156.04000: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096156.04090: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096156.04208: variable 'network_packages' from source: role '' defaults 11792 1727096156.04321: variable '__network_provider_setup' from source: role '' defaults 11792 1727096156.04472: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096156.04475: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096156.04478: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096156.04489: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096156.04683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096156.07375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096156.07380: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096156.07383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096156.07385: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096156.07390: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096156.07485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.07524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.07549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.07595: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.07608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.07664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.07689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.07712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.07760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.07776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.08073: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096156.08154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.08208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.08232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.08273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.08298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.08419: variable 'ansible_python' from source: facts 11792 1727096156.08437: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096156.08694: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096156.08777: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096156.09084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.09107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11792 1727096156.09133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.09306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.09322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.09366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.09505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.09531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.09708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.09726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.09924: variable 'network_connections' from source: task vars 11792 1727096156.09948: variable 'port2_profile' from source: play vars 11792 1727096156.10063: variable 'port2_profile' from source: play vars 11792 1727096156.10088: variable 'port1_profile' from source: play vars 11792 1727096156.10200: variable 'port1_profile' from source: play vars 11792 1727096156.10210: variable 'controller_profile' from source: play vars 11792 1727096156.10382: variable 'controller_profile' from source: play vars 11792 1727096156.10406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096156.10432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096156.10481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.10513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096156.10580: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096156.10881: variable 'network_connections' from source: task vars 11792 1727096156.10893: variable 'port2_profile' from source: play vars 11792 1727096156.10996: variable 'port2_profile' from source: play vars 11792 
1727096156.11012: variable 'port1_profile' from source: play vars 11792 1727096156.11120: variable 'port1_profile' from source: play vars 11792 1727096156.11130: variable 'controller_profile' from source: play vars 11792 1727096156.11236: variable 'controller_profile' from source: play vars 11792 1727096156.11273: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096156.11361: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096156.11678: variable 'network_connections' from source: task vars 11792 1727096156.11681: variable 'port2_profile' from source: play vars 11792 1727096156.11835: variable 'port2_profile' from source: play vars 11792 1727096156.11838: variable 'port1_profile' from source: play vars 11792 1727096156.11840: variable 'port1_profile' from source: play vars 11792 1727096156.11847: variable 'controller_profile' from source: play vars 11792 1727096156.11953: variable 'controller_profile' from source: play vars 11792 1727096156.11994: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096156.12414: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096156.12731: variable 'network_connections' from source: task vars 11792 1727096156.12734: variable 'port2_profile' from source: play vars 11792 1727096156.12821: variable 'port2_profile' from source: play vars 11792 1727096156.12830: variable 'port1_profile' from source: play vars 11792 1727096156.13012: variable 'port1_profile' from source: play vars 11792 1727096156.13020: variable 'controller_profile' from source: play vars 11792 1727096156.13128: variable 'controller_profile' from source: play vars 11792 1727096156.13190: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096156.13369: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096156.13376: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096156.13612: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096156.13984: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096156.14541: variable 'network_connections' from source: task vars 11792 1727096156.14545: variable 'port2_profile' from source: play vars 11792 1727096156.14618: variable 'port2_profile' from source: play vars 11792 1727096156.14626: variable 'port1_profile' from source: play vars 11792 1727096156.14700: variable 'port1_profile' from source: play vars 11792 1727096156.14708: variable 'controller_profile' from source: play vars 11792 1727096156.14777: variable 'controller_profile' from source: play vars 11792 1727096156.14786: variable 'ansible_distribution' from source: facts 11792 1727096156.14789: variable '__network_rh_distros' from source: role '' defaults 11792 1727096156.14796: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.14819: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096156.14995: variable 'ansible_distribution' from source: facts 11792 1727096156.14998: variable '__network_rh_distros' from source: role '' defaults 11792 1727096156.15004: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.15025: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096156.15186: 
variable 'ansible_distribution' from source: facts 11792 1727096156.15190: variable '__network_rh_distros' from source: role '' defaults 11792 1727096156.15194: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.15228: variable 'network_provider' from source: set_fact 11792 1727096156.15249: variable 'ansible_facts' from source: unknown 11792 1727096156.15954: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11792 1727096156.15958: when evaluation is False, skipping this task 11792 1727096156.15960: _execute() done 11792 1727096156.15963: dumping result to json 11792 1727096156.15966: done dumping result, returning 11792 1727096156.15977: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-d9c7-3fc0-00000000069b] 11792 1727096156.15980: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069b skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11792 1727096156.16137: no more pending results, returning what we have 11792 1727096156.16140: results queue empty 11792 1727096156.16141: checking for any_errors_fatal 11792 1727096156.16148: done checking for any_errors_fatal 11792 1727096156.16149: checking for max_fail_percentage 11792 1727096156.16151: done checking for max_fail_percentage 11792 1727096156.16151: checking to see if all hosts have failed and the running result is not ok 11792 1727096156.16230: done checking to see if all hosts have failed 11792 1727096156.16231: getting the remaining hosts for this loop 11792 1727096156.16233: done getting the remaining hosts for this loop 11792 1727096156.16237: getting the next task for host managed_node2 11792 1727096156.16251: done getting next task for host managed_node2 11792 1727096156.16260: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096156.16265: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096156.16384: getting variables 11792 1727096156.16385: in VariableManager get_vars() 11792 1727096156.16462: Calling all_inventory to load vars for managed_node2 11792 1727096156.16465: Calling groups_inventory to load vars for managed_node2 11792 1727096156.16469: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096156.16478: Calling all_plugins_play to load vars for managed_node2 11792 1727096156.16481: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096156.16484: Calling groups_plugins_play to load vars for managed_node2 11792 1727096156.17182: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069b 11792 1727096156.17186: WORKER PROCESS EXITING 11792 1727096156.17702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096156.19503: done with get_vars() 11792 1727096156.19528: done getting variables 11792 1727096156.19596: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:56 -0400 (0:00:00.177) 0:00:38.475 ****** 11792 1727096156.19624: entering _queue_task() for managed_node2/package 11792 1727096156.19911: worker is 1 (out of 1 available) 11792 1727096156.19925: exiting _queue_task() for managed_node2/package 11792 1727096156.19939: done queuing things up, now waiting for results queue to drain 11792 1727096156.19941: waiting for pending results... 
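The "Install packages" task above was skipped because every package in network_packages is already installed, i.e. the condition "not network_packages is subset(ansible_facts.packages.keys())" evaluated False. A minimal sketch of a package task guarded that way, assuming package facts were gathered earlier in the play and that the action resolves to ansible.builtin.package as the ActionModule load in the trace indicates (this is not the role's actual tasks/main.yml):

    # Illustrative only; network_packages is a role default per the trace above
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
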
11792 1727096156.20133: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096156.20242: in run() - task 0afff68d-5257-d9c7-3fc0-00000000069c 11792 1727096156.20258: variable 'ansible_search_path' from source: unknown 11792 1727096156.20262: variable 'ansible_search_path' from source: unknown 11792 1727096156.20301: calling self._execute() 11792 1727096156.20475: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096156.20479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096156.20482: variable 'omit' from source: magic vars 11792 1727096156.20746: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.20757: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096156.20872: variable 'network_state' from source: role '' defaults 11792 1727096156.20884: Evaluated conditional (network_state != {}): False 11792 1727096156.20888: when evaluation is False, skipping this task 11792 1727096156.20891: _execute() done 11792 1727096156.20894: dumping result to json 11792 1727096156.20896: done dumping result, returning 11792 1727096156.20925: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-00000000069c] 11792 1727096156.20929: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096156.21082: no more pending results, returning what we have 11792 1727096156.21086: results queue empty 11792 1727096156.21087: checking for any_errors_fatal 11792 1727096156.21095: done checking for any_errors_fatal 11792 1727096156.21096: checking for max_fail_percentage 11792 1727096156.21098: done checking for max_fail_percentage 11792 1727096156.21099: checking to see if all hosts have failed and the running result is not ok 11792 1727096156.21100: done checking to see if all hosts have failed 11792 1727096156.21100: getting the remaining hosts for this loop 11792 1727096156.21102: done getting the remaining hosts for this loop 11792 1727096156.21105: getting the next task for host managed_node2 11792 1727096156.21114: done getting next task for host managed_node2 11792 1727096156.21117: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096156.21123: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096156.21143: getting variables 11792 1727096156.21145: in VariableManager get_vars() 11792 1727096156.21189: Calling all_inventory to load vars for managed_node2 11792 1727096156.21192: Calling groups_inventory to load vars for managed_node2 11792 1727096156.21194: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096156.21203: Calling all_plugins_play to load vars for managed_node2 11792 1727096156.21206: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096156.21208: Calling groups_plugins_play to load vars for managed_node2 11792 1727096156.21783: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069c 11792 1727096156.21790: WORKER PROCESS EXITING 11792 1727096156.22514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096156.24013: done with get_vars() 11792 1727096156.24036: done getting variables 11792 1727096156.24085: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:56 -0400 (0:00:00.044) 0:00:38.520 ****** 11792 1727096156.24115: entering _queue_task() for managed_node2/package 11792 1727096156.24391: worker is 1 (out of 1 available) 11792 1727096156.24404: exiting _queue_task() for managed_node2/package 11792 1727096156.24416: done queuing things up, now waiting for results queue to drain 11792 1727096156.24418: waiting for pending results... 
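The "Install NetworkManager and nmstate when using network_state variable" task above, and the python3-libnmstate task just queued, are both guarded by network_state != {} and are skipped here because network_state is empty. A hedged sketch of the first of these tasks, with the package list inferred from the task name rather than read from the role source:

    # Sketch under assumptions; exact package names and layout are guesses
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}
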
11792 1727096156.24609: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096156.24715: in run() - task 0afff68d-5257-d9c7-3fc0-00000000069d 11792 1727096156.24727: variable 'ansible_search_path' from source: unknown 11792 1727096156.24731: variable 'ansible_search_path' from source: unknown 11792 1727096156.24762: calling self._execute() 11792 1727096156.24835: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096156.24841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096156.24849: variable 'omit' from source: magic vars 11792 1727096156.25133: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.25143: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096156.25234: variable 'network_state' from source: role '' defaults 11792 1727096156.25244: Evaluated conditional (network_state != {}): False 11792 1727096156.25247: when evaluation is False, skipping this task 11792 1727096156.25249: _execute() done 11792 1727096156.25255: dumping result to json 11792 1727096156.25258: done dumping result, returning 11792 1727096156.25263: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-00000000069d] 11792 1727096156.25270: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069d 11792 1727096156.25371: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069d 11792 1727096156.25375: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096156.25451: no more pending results, returning what we have 11792 1727096156.25460: results queue empty 11792 1727096156.25461: checking for any_errors_fatal 11792 1727096156.25486: done checking for any_errors_fatal 11792 1727096156.25487: checking for max_fail_percentage 11792 1727096156.25489: done checking for max_fail_percentage 11792 1727096156.25490: checking to see if all hosts have failed and the running result is not ok 11792 1727096156.25490: done checking to see if all hosts have failed 11792 1727096156.25491: getting the remaining hosts for this loop 11792 1727096156.25493: done getting the remaining hosts for this loop 11792 1727096156.25496: getting the next task for host managed_node2 11792 1727096156.25504: done getting next task for host managed_node2 11792 1727096156.25508: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096156.25513: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096156.25528: getting variables 11792 1727096156.25530: in VariableManager get_vars() 11792 1727096156.25564: Calling all_inventory to load vars for managed_node2 11792 1727096156.25572: Calling groups_inventory to load vars for managed_node2 11792 1727096156.25574: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096156.25587: Calling all_plugins_play to load vars for managed_node2 11792 1727096156.25590: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096156.25592: Calling groups_plugins_play to load vars for managed_node2 11792 1727096156.26922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096156.28418: done with get_vars() 11792 1727096156.28447: done getting variables 11792 1727096156.28509: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:56 -0400 (0:00:00.044) 0:00:38.564 ****** 11792 1727096156.28544: entering _queue_task() for managed_node2/service 11792 1727096156.28908: worker is 1 (out of 1 available) 11792 1727096156.28920: exiting _queue_task() for managed_node2/service 11792 1727096156.28933: done queuing things up, now waiting for results queue to drain 11792 1727096156.28934: waiting for pending results... 
11792 1727096156.29275: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096156.29414: in run() - task 0afff68d-5257-d9c7-3fc0-00000000069e 11792 1727096156.29430: variable 'ansible_search_path' from source: unknown 11792 1727096156.29433: variable 'ansible_search_path' from source: unknown 11792 1727096156.29482: calling self._execute() 11792 1727096156.29614: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096156.29620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096156.29624: variable 'omit' from source: magic vars 11792 1727096156.30093: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.30097: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096156.30193: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096156.30328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096156.32800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096156.32909: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096156.32962: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096156.32982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096156.33021: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096156.33109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.33147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.33180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.33211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.33241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.33277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.33296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.33325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11792 1727096156.33354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.33368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.33414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.33471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.33484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.33517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.33529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.33675: variable 'network_connections' from source: task vars 11792 1727096156.33694: variable 'port2_profile' from source: play vars 11792 1727096156.33743: variable 'port2_profile' from source: play vars 11792 1727096156.33752: variable 'port1_profile' from source: play vars 11792 1727096156.33801: variable 'port1_profile' from source: play vars 11792 1727096156.33807: variable 'controller_profile' from source: play vars 11792 1727096156.33852: variable 'controller_profile' from source: play vars 11792 1727096156.33925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096156.34107: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096156.34136: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096156.34164: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096156.34206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096156.34246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096156.34265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096156.34286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.34306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096156.34348: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096156.34592: variable 'network_connections' from source: task vars 11792 1727096156.34595: variable 'port2_profile' from source: play vars 11792 1727096156.34644: variable 'port2_profile' from source: play vars 11792 1727096156.34650: variable 'port1_profile' from source: play vars 11792 1727096156.34696: variable 'port1_profile' from source: play vars 11792 1727096156.34702: variable 'controller_profile' from source: play vars 11792 1727096156.34746: variable 'controller_profile' from source: play vars 11792 1727096156.34771: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096156.34783: when evaluation is False, skipping this task 11792 1727096156.34788: _execute() done 11792 1727096156.34790: dumping result to json 11792 1727096156.34793: done dumping result, returning 11792 1727096156.34795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-00000000069e] 11792 1727096156.34797: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069e skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096156.34931: no more pending results, returning what we have 11792 1727096156.34935: results queue empty 11792 1727096156.34936: checking for any_errors_fatal 11792 1727096156.34942: done checking for any_errors_fatal 11792 1727096156.34943: checking for max_fail_percentage 11792 1727096156.34944: done checking for max_fail_percentage 11792 1727096156.34945: checking to see if all hosts have failed and the running result is not ok 11792 1727096156.34946: done checking to see if all hosts have failed 11792 1727096156.34946: getting the remaining hosts for this loop 11792 1727096156.34948: done getting the remaining hosts for this loop 11792 1727096156.34951: getting the next task for host managed_node2 11792 1727096156.34958: done getting next task for host managed_node2 11792 1727096156.34962: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096156.34968: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11792 1727096156.34986: getting variables 11792 1727096156.34987: in VariableManager get_vars() 11792 1727096156.35024: Calling all_inventory to load vars for managed_node2 11792 1727096156.35027: Calling groups_inventory to load vars for managed_node2 11792 1727096156.35030: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096156.35040: Calling all_plugins_play to load vars for managed_node2 11792 1727096156.35043: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096156.35045: Calling groups_plugins_play to load vars for managed_node2 11792 1727096156.35582: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069e 11792 1727096156.35586: WORKER PROCESS EXITING 11792 1727096156.35907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096156.37457: done with get_vars() 11792 1727096156.37499: done getting variables 11792 1727096156.37617: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:56 -0400 (0:00:00.092) 0:00:38.656 ****** 11792 1727096156.37783: entering _queue_task() for managed_node2/service 11792 1727096156.38273: worker is 1 (out of 1 available) 11792 1727096156.38287: exiting _queue_task() for managed_node2/service 11792 1727096156.38300: done queuing things up, now waiting for results queue to drain 11792 1727096156.38302: waiting for pending results... 
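The "Restart NetworkManager due to wireless or team interfaces" task above is skipped for the same wireless/team reason as before. The task queued next, "Enable and start NetworkManager", does run: the trace below shows its condition (network_provider == "nm" or network_state != {}) evaluating True and the service action being prepared over ssh. A hedged sketch of such a task, using the network_service_name variable named in the trace (the state/enabled values are assumptions, and this is not the role's actual source):

    # Assumed sketch; state/enabled values are not confirmed by the trace
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
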
11792 1727096156.38618: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096156.38785: in run() - task 0afff68d-5257-d9c7-3fc0-00000000069f 11792 1727096156.38798: variable 'ansible_search_path' from source: unknown 11792 1727096156.38806: variable 'ansible_search_path' from source: unknown 11792 1727096156.38833: calling self._execute() 11792 1727096156.38964: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096156.38970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096156.38973: variable 'omit' from source: magic vars 11792 1727096156.39349: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.39353: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096156.39499: variable 'network_provider' from source: set_fact 11792 1727096156.39503: variable 'network_state' from source: role '' defaults 11792 1727096156.39513: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11792 1727096156.39519: variable 'omit' from source: magic vars 11792 1727096156.39598: variable 'omit' from source: magic vars 11792 1727096156.39643: variable 'network_service_name' from source: role '' defaults 11792 1727096156.39686: variable 'network_service_name' from source: role '' defaults 11792 1727096156.39769: variable '__network_provider_setup' from source: role '' defaults 11792 1727096156.39773: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096156.39866: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096156.39873: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096156.39938: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096156.40145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096156.42899: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096156.43001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096156.43021: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096156.43049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096156.43091: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096156.43164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.43207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.43379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.43383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11792 1727096156.43386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.43388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.43390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.43392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.43446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.43449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.43718: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096156.43859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.43891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.43922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.43962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.43984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.44074: variable 'ansible_python' from source: facts 11792 1727096156.44097: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096156.44309: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096156.44313: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096156.44489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.44497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.44500: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.44653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.44690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.44950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096156.45193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096156.45196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096156.45199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096156.45202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096156.45321: variable 'network_connections' from source: task vars 11792 1727096156.45329: variable 'port2_profile' from source: play vars 11792 1727096156.45538: variable 'port2_profile' from source: play vars 11792 1727096156.45541: variable 'port1_profile' from source: play vars 11792 1727096156.45647: variable 'port1_profile' from source: play vars 11792 1727096156.45650: variable 'controller_profile' from source: play vars 11792 1727096156.45898: variable 'controller_profile' from source: play vars 11792 1727096156.46189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096156.46578: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096156.46581: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096156.46586: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096156.46650: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096156.46878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096156.46883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096156.46888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11792 1727096156.46891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096156.46894: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096156.47324: variable 'network_connections' from source: task vars 11792 1727096156.47331: variable 'port2_profile' from source: play vars 11792 1727096156.47334: variable 'port2_profile' from source: play vars 11792 1727096156.47336: variable 'port1_profile' from source: play vars 11792 1727096156.47386: variable 'port1_profile' from source: play vars 11792 1727096156.47394: variable 'controller_profile' from source: play vars 11792 1727096156.47472: variable 'controller_profile' from source: play vars 11792 1727096156.47511: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096156.47591: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096156.47926: variable 'network_connections' from source: task vars 11792 1727096156.47929: variable 'port2_profile' from source: play vars 11792 1727096156.48008: variable 'port2_profile' from source: play vars 11792 1727096156.48015: variable 'port1_profile' from source: play vars 11792 1727096156.48087: variable 'port1_profile' from source: play vars 11792 1727096156.48095: variable 'controller_profile' from source: play vars 11792 1727096156.48321: variable 'controller_profile' from source: play vars 11792 1727096156.48340: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096156.48740: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096156.48852: variable 'network_connections' from source: task vars 11792 1727096156.48866: variable 'port2_profile' from source: play vars 11792 1727096156.48934: variable 'port2_profile' from source: play vars 11792 1727096156.48943: variable 'port1_profile' from source: play vars 11792 1727096156.49078: variable 'port1_profile' from source: play vars 11792 1727096156.49082: variable 'controller_profile' from source: play vars 11792 1727096156.49130: variable 'controller_profile' from source: play vars 11792 1727096156.49183: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096156.49243: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096156.49251: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096156.49331: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096156.49986: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096156.50717: variable 'network_connections' from source: task vars 11792 1727096156.50721: variable 'port2_profile' from source: play vars 11792 1727096156.50781: variable 'port2_profile' from source: play vars 11792 1727096156.50789: variable 'port1_profile' from source: play vars 11792 1727096156.50920: variable 'port1_profile' from source: play vars 11792 1727096156.50923: variable 'controller_profile' from source: play vars 11792 1727096156.50926: variable 'controller_profile' from source: play vars 11792 1727096156.50930: variable 'ansible_distribution' from source: facts 11792 1727096156.50932: variable '__network_rh_distros' from source: role '' defaults 11792 1727096156.50934: variable 
'ansible_distribution_major_version' from source: facts 11792 1727096156.50972: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096156.51158: variable 'ansible_distribution' from source: facts 11792 1727096156.51162: variable '__network_rh_distros' from source: role '' defaults 11792 1727096156.51165: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.51196: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096156.51382: variable 'ansible_distribution' from source: facts 11792 1727096156.51385: variable '__network_rh_distros' from source: role '' defaults 11792 1727096156.51390: variable 'ansible_distribution_major_version' from source: facts 11792 1727096156.51459: variable 'network_provider' from source: set_fact 11792 1727096156.51462: variable 'omit' from source: magic vars 11792 1727096156.51485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096156.51580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096156.51585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096156.51588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096156.51590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096156.51592: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096156.51596: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096156.51598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096156.51727: Set connection var ansible_timeout to 10 11792 1727096156.51730: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096156.51732: Set connection var ansible_shell_executable to /bin/sh 11792 1727096156.51734: Set connection var ansible_pipelining to False 11792 1727096156.51736: Set connection var ansible_shell_type to sh 11792 1727096156.51738: Set connection var ansible_connection to ssh 11792 1727096156.51771: variable 'ansible_shell_executable' from source: unknown 11792 1727096156.51774: variable 'ansible_connection' from source: unknown 11792 1727096156.51827: variable 'ansible_module_compression' from source: unknown 11792 1727096156.51831: variable 'ansible_shell_type' from source: unknown 11792 1727096156.51833: variable 'ansible_shell_executable' from source: unknown 11792 1727096156.51836: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096156.51838: variable 'ansible_pipelining' from source: unknown 11792 1727096156.51840: variable 'ansible_timeout' from source: unknown 11792 1727096156.51845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096156.51918: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096156.51979: variable 'omit' from source: magic vars 11792 1727096156.51983: starting attempt loop 11792 1727096156.51986: running the 
handler 11792 1727096156.52056: variable 'ansible_facts' from source: unknown 11792 1727096156.52817: _low_level_execute_command(): starting 11792 1727096156.52820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096156.53525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096156.53648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096156.53654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096156.53656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096156.53658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096156.53662: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096156.53665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.53667: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096156.53671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096156.53673: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096156.53695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.53721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096156.53734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096156.53743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096156.53829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096156.55572: stdout chunk (state=3): >>>/root <<< 11792 1727096156.55806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096156.55810: stdout chunk (state=3): >>><<< 11792 1727096156.55813: stderr chunk (state=3): >>><<< 11792 1727096156.55937: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096156.55940: _low_level_execute_command(): starting 11792 1727096156.55943: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689 `" && echo ansible-tmp-1727096156.558354-13638-266371287414689="` echo /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689 `" ) && sleep 0' 11792 1727096156.57002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096156.57007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.57132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.57185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096156.57264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096156.57327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096156.59391: stdout chunk (state=3): >>>ansible-tmp-1727096156.558354-13638-266371287414689=/root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689 <<< 11792 1727096156.59534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096156.59578: stderr chunk (state=3): >>><<< 11792 1727096156.59582: stdout chunk (state=3): >>><<< 11792 1727096156.59585: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096156.558354-13638-266371287414689=/root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096156.59638: variable 'ansible_module_compression' from source: unknown 11792 1727096156.59678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11792 1727096156.59731: variable 'ansible_facts' from source: unknown 11792 1727096156.59876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/AnsiballZ_systemd.py 11792 1727096156.60071: Sending initial data 11792 1727096156.60075: Sent initial data (155 bytes) 11792 1727096156.60428: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096156.60431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.60434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096156.60436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096156.60438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.60491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096156.60500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096156.60504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096156.60536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096156.62222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096156.62282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096156.62313: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpbug5t2fq /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/AnsiballZ_systemd.py <<< 11792 1727096156.62317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/AnsiballZ_systemd.py" <<< 11792 1727096156.62351: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpbug5t2fq" to remote "/root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/AnsiballZ_systemd.py" <<< 11792 1727096156.63594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096156.63671: stderr chunk (state=3): >>><<< 11792 1727096156.63674: stdout chunk (state=3): >>><<< 11792 1727096156.63677: done transferring module to remote 11792 1727096156.63687: _low_level_execute_command(): starting 11792 1727096156.63692: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/ /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/AnsiballZ_systemd.py && sleep 0' 11792 1727096156.64189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096156.64196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096156.64232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096156.66071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096156.66160: stderr chunk (state=3): >>><<< 11792 1727096156.66163: stdout chunk (state=3): >>><<< 11792 1727096156.66166: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096156.66170: _low_level_execute_command(): starting 11792 1727096156.66172: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/AnsiballZ_systemd.py && sleep 0' 11792 1727096156.66558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096156.66562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096156.66564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096156.66573: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096156.66575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.66623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096156.66628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096156.66634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096156.66666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096156.96542: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4452352", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296317440", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "512631000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 11792 1727096156.96577: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", 
"LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "syste<<< 11792 1727096156.96648: stdout chunk (state=3): >>>md-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11792 1727096156.98581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096156.98677: stderr chunk (state=3): >>><<< 11792 1727096156.98680: stdout chunk (state=3): >>><<< 11792 1727096156.98701: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4452352", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296317440", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "512631000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096156.98848: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096156.98976: _low_level_execute_command(): starting 11792 1727096156.98979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096156.558354-13638-266371287414689/ > /dev/null 2>&1 && sleep 0' 11792 1727096156.99465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096156.99485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096156.99497: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096156.99538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096156.99551: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096156.99632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096157.01549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096157.01585: stderr chunk (state=3): >>><<< 11792 1727096157.01595: stdout chunk (state=3): >>><<< 11792 1727096157.01608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096157.01617: handler run complete 11792 1727096157.01659: attempt loop complete, returning result 11792 1727096157.01662: _execute() done 11792 1727096157.01665: dumping result to json 11792 1727096157.01679: done dumping result, returning 11792 1727096157.01689: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-d9c7-3fc0-00000000069f] 11792 1727096157.01693: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069f ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096157.02003: no more pending results, returning what we have 11792 1727096157.02007: results queue empty 11792 1727096157.02008: checking for any_errors_fatal 11792 1727096157.02013: done checking for any_errors_fatal 11792 1727096157.02014: checking for max_fail_percentage 11792 1727096157.02016: done checking for max_fail_percentage 11792 1727096157.02016: checking to see if all hosts have failed and the running result is not ok 11792 1727096157.02017: done checking to see if all hosts have failed 11792 1727096157.02018: getting the remaining hosts for this loop 11792 1727096157.02019: done getting the remaining hosts for this loop 11792 1727096157.02022: getting the next task for host managed_node2 11792 1727096157.02028: done getting next task for host managed_node2 11792 1727096157.02031: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 1727096157.02036: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096157.02045: getting variables 11792 1727096157.02046: in VariableManager get_vars() 11792 1727096157.02083: Calling all_inventory to load vars for managed_node2 11792 1727096157.02087: Calling groups_inventory to load vars for managed_node2 11792 1727096157.02089: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096157.02098: Calling all_plugins_play to load vars for managed_node2 11792 1727096157.02101: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096157.02103: Calling groups_plugins_play to load vars for managed_node2 11792 1727096157.02682: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000069f 11792 1727096157.02686: WORKER PROCESS EXITING 11792 1727096157.02929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096157.04114: done with get_vars() 11792 1727096157.04139: done getting variables 11792 1727096157.04203: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:57 -0400 (0:00:00.664) 0:00:39.321 ****** 11792 1727096157.04247: entering _queue_task() for managed_node2/service 11792 1727096157.04592: worker is 1 (out of 1 available) 11792 1727096157.04604: exiting _queue_task() for managed_node2/service 11792 1727096157.04619: done queuing things up, now waiting for results queue to drain 11792 1727096157.04620: waiting for pending results... 
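For reference, the two service tasks traced immediately above and below reduce to a minimal playbook sketch like the following. It is inferred only from what is visible in this log (module_args name=NetworkManager, state=started, enabled=true, the censored no_log result, and the __network_wpa_supplicant_required gate at roles/network/tasks/main.yml:133); it is not the role's actual tasks file, and the wpa_supplicant unit name is an assumption, since the run skips that task before building its module_args.

  # Sketch reconstructed from this trace, not copied from the role.
  - hosts: managed_node2
    tasks:
      - name: Enable and start NetworkManager
        ansible.builtin.systemd:        # the log invokes this as ansible.legacy.systemd
          name: NetworkManager
          state: started
          enabled: true
        no_log: true                    # matches the censored "ok" result shown above

      - name: Enable and start wpa_supplicant
        ansible.builtin.systemd:
          name: wpa_supplicant          # unit name assumed; not present in the log
          state: started
          enabled: true
        when: __network_wpa_supplicant_required | bool

As the trace that follows shows, the second task is skipped because the conditional __network_wpa_supplicant_required evaluates to False for this run.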
11792 1727096157.04996: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 1727096157.05101: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a0 11792 1727096157.05121: variable 'ansible_search_path' from source: unknown 11792 1727096157.05129: variable 'ansible_search_path' from source: unknown 11792 1727096157.05214: calling self._execute() 11792 1727096157.05306: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096157.05324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096157.05473: variable 'omit' from source: magic vars 11792 1727096157.05733: variable 'ansible_distribution_major_version' from source: facts 11792 1727096157.05751: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096157.05882: variable 'network_provider' from source: set_fact 11792 1727096157.05893: Evaluated conditional (network_provider == "nm"): True 11792 1727096157.05999: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096157.06097: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096157.06292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096157.08732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096157.08806: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096157.09072: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096157.09076: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096157.09078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096157.09081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096157.09084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096157.09086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096157.09113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096157.09133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096157.09189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096157.09222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11792 1727096157.09251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096157.09300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096157.09325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096157.09374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096157.09402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096157.09437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096157.09484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096157.09503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096157.09669: variable 'network_connections' from source: task vars 11792 1727096157.09689: variable 'port2_profile' from source: play vars 11792 1727096157.09766: variable 'port2_profile' from source: play vars 11792 1727096157.09785: variable 'port1_profile' from source: play vars 11792 1727096157.09857: variable 'port1_profile' from source: play vars 11792 1727096157.09873: variable 'controller_profile' from source: play vars 11792 1727096157.09934: variable 'controller_profile' from source: play vars 11792 1727096157.10029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096157.10207: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096157.10248: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096157.10399: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096157.10401: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096157.10403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096157.10405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096157.10407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096157.10430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096157.10495: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096157.10763: variable 'network_connections' from source: task vars 11792 1727096157.10775: variable 'port2_profile' from source: play vars 11792 1727096157.10862: variable 'port2_profile' from source: play vars 11792 1727096157.10878: variable 'port1_profile' from source: play vars 11792 1727096157.10944: variable 'port1_profile' from source: play vars 11792 1727096157.10972: variable 'controller_profile' from source: play vars 11792 1727096157.11033: variable 'controller_profile' from source: play vars 11792 1727096157.11079: Evaluated conditional (__network_wpa_supplicant_required): False 11792 1727096157.11088: when evaluation is False, skipping this task 11792 1727096157.11097: _execute() done 11792 1727096157.11104: dumping result to json 11792 1727096157.11111: done dumping result, returning 11792 1727096157.11123: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-d9c7-3fc0-0000000006a0] 11792 1727096157.11131: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a0 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11792 1727096157.11318: no more pending results, returning what we have 11792 1727096157.11322: results queue empty 11792 1727096157.11323: checking for any_errors_fatal 11792 1727096157.11344: done checking for any_errors_fatal 11792 1727096157.11345: checking for max_fail_percentage 11792 1727096157.11347: done checking for max_fail_percentage 11792 1727096157.11348: checking to see if all hosts have failed and the running result is not ok 11792 1727096157.11349: done checking to see if all hosts have failed 11792 1727096157.11350: getting the remaining hosts for this loop 11792 1727096157.11354: done getting the remaining hosts for this loop 11792 1727096157.11359: getting the next task for host managed_node2 11792 1727096157.11370: done getting next task for host managed_node2 11792 1727096157.11374: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096157.11380: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096157.11399: getting variables 11792 1727096157.11401: in VariableManager get_vars() 11792 1727096157.11444: Calling all_inventory to load vars for managed_node2 11792 1727096157.11447: Calling groups_inventory to load vars for managed_node2 11792 1727096157.11449: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096157.11464: Calling all_plugins_play to load vars for managed_node2 11792 1727096157.11770: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096157.11777: Calling groups_plugins_play to load vars for managed_node2 11792 1727096157.12481: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a0 11792 1727096157.12485: WORKER PROCESS EXITING 11792 1727096157.13305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096157.14857: done with get_vars() 11792 1727096157.15112: done getting variables 11792 1727096157.15180: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:57 -0400 (0:00:00.109) 0:00:39.431 ****** 11792 1727096157.15216: entering _queue_task() for managed_node2/service 11792 1727096157.15963: worker is 1 (out of 1 available) 11792 1727096157.15977: exiting _queue_task() for managed_node2/service 11792 1727096157.15990: done queuing things up, now waiting for results queue to drain 11792 1727096157.15992: waiting for pending results... 
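The next task is evaluated the same way, but its result is additionally redacted: the skip below is reported only as "censored" because the task sets `no_log: true`. A hedged sketch of such a task is shown here; only the task name, the initscripts condition and the no_log flag are visible in the log, so the module arguments are assumptions.

```yaml
# Sketch of a no_log-protected task; the service name and arguments are
# assumptions, only the name, condition and no_log flag come from the log.
- name: Enable network service
  ansible.builtin.service:
    name: network          # assumed service name
    enabled: true
    state: started
  when: network_provider == "initscripts"
  no_log: true
```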
11792 1727096157.16398: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096157.16524: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a1 11792 1727096157.16546: variable 'ansible_search_path' from source: unknown 11792 1727096157.16550: variable 'ansible_search_path' from source: unknown 11792 1727096157.16582: calling self._execute() 11792 1727096157.16652: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096157.16660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096157.16671: variable 'omit' from source: magic vars 11792 1727096157.16938: variable 'ansible_distribution_major_version' from source: facts 11792 1727096157.16947: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096157.17031: variable 'network_provider' from source: set_fact 11792 1727096157.17035: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096157.17037: when evaluation is False, skipping this task 11792 1727096157.17040: _execute() done 11792 1727096157.17043: dumping result to json 11792 1727096157.17046: done dumping result, returning 11792 1727096157.17054: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-d9c7-3fc0-0000000006a1] 11792 1727096157.17061: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a1 11792 1727096157.17148: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a1 11792 1727096157.17151: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096157.17216: no more pending results, returning what we have 11792 1727096157.17220: results queue empty 11792 1727096157.17221: checking for any_errors_fatal 11792 1727096157.17230: done checking for any_errors_fatal 11792 1727096157.17231: checking for max_fail_percentage 11792 1727096157.17233: done checking for max_fail_percentage 11792 1727096157.17234: checking to see if all hosts have failed and the running result is not ok 11792 1727096157.17234: done checking to see if all hosts have failed 11792 1727096157.17235: getting the remaining hosts for this loop 11792 1727096157.17236: done getting the remaining hosts for this loop 11792 1727096157.17240: getting the next task for host managed_node2 11792 1727096157.17247: done getting next task for host managed_node2 11792 1727096157.17251: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096157.17256: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096157.17273: getting variables 11792 1727096157.17274: in VariableManager get_vars() 11792 1727096157.17306: Calling all_inventory to load vars for managed_node2 11792 1727096157.17308: Calling groups_inventory to load vars for managed_node2 11792 1727096157.17310: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096157.17318: Calling all_plugins_play to load vars for managed_node2 11792 1727096157.17321: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096157.17323: Calling groups_plugins_play to load vars for managed_node2 11792 1727096157.18287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096157.20611: done with get_vars() 11792 1727096157.20642: done getting variables 11792 1727096157.20718: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:57 -0400 (0:00:00.055) 0:00:39.486 ****** 11792 1727096157.20761: entering _queue_task() for managed_node2/copy 11792 1727096157.21134: worker is 1 (out of 1 available) 11792 1727096157.21147: exiting _queue_task() for managed_node2/copy 11792 1727096157.21163: done queuing things up, now waiting for results queue to drain 11792 1727096157.21164: waiting for pending results... 
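The copy-based task queued above is also gated on the initscripts provider, so it is skipped below for the same reason. A sketch under assumptions follows; the destination path and file content are not in the log and are guesses.

```yaml
# Sketch only: destination and content are assumptions; the log shows only
# the task name, the copy action plugin and the initscripts condition.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network   # assumed path
    content: "# Created by the network system role\n"
    mode: "0644"
    force: false
  when: network_provider == "initscripts"
```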
11792 1727096157.21538: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096157.21620: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a2 11792 1727096157.21651: variable 'ansible_search_path' from source: unknown 11792 1727096157.21664: variable 'ansible_search_path' from source: unknown 11792 1727096157.21705: calling self._execute() 11792 1727096157.21813: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096157.21825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096157.21840: variable 'omit' from source: magic vars 11792 1727096157.22236: variable 'ansible_distribution_major_version' from source: facts 11792 1727096157.22255: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096157.22397: variable 'network_provider' from source: set_fact 11792 1727096157.22401: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096157.22404: when evaluation is False, skipping this task 11792 1727096157.22406: _execute() done 11792 1727096157.22408: dumping result to json 11792 1727096157.22472: done dumping result, returning 11792 1727096157.22476: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-d9c7-3fc0-0000000006a2] 11792 1727096157.22479: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a2 11792 1727096157.22773: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a2 11792 1727096157.22777: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11792 1727096157.22827: no more pending results, returning what we have 11792 1727096157.22831: results queue empty 11792 1727096157.22832: checking for any_errors_fatal 11792 1727096157.22836: done checking for any_errors_fatal 11792 1727096157.22837: checking for max_fail_percentage 11792 1727096157.22839: done checking for max_fail_percentage 11792 1727096157.22840: checking to see if all hosts have failed and the running result is not ok 11792 1727096157.22840: done checking to see if all hosts have failed 11792 1727096157.22841: getting the remaining hosts for this loop 11792 1727096157.22843: done getting the remaining hosts for this loop 11792 1727096157.22846: getting the next task for host managed_node2 11792 1727096157.22854: done getting next task for host managed_node2 11792 1727096157.22858: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096157.22862: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096157.22880: getting variables 11792 1727096157.22881: in VariableManager get_vars() 11792 1727096157.22914: Calling all_inventory to load vars for managed_node2 11792 1727096157.22917: Calling groups_inventory to load vars for managed_node2 11792 1727096157.22919: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096157.22928: Calling all_plugins_play to load vars for managed_node2 11792 1727096157.22931: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096157.22934: Calling groups_plugins_play to load vars for managed_node2 11792 1727096157.24552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096157.25410: done with get_vars() 11792 1727096157.25425: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:57 -0400 (0:00:00.047) 0:00:39.534 ****** 11792 1727096157.25512: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096157.25794: worker is 1 (out of 1 available) 11792 1727096157.25806: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096157.25818: done queuing things up, now waiting for results queue to drain 11792 1727096157.25820: waiting for pending results... 
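This is the task that actually calls the fedora.linux_system_roles.network_connections module. The profile variables resolved below (controller_profile, port1_profile, port2_profile) end up as the module's connections list; the shape visible in the module_args printed further down is reproduced here as a sketch. The variable layout is inferred from the log, not copied from the play.

```yaml
# Inferred from the module_args printed later in the log: the play tears
# down the bond controller profile and both port profiles.
network_connections:
  - name: bond0.1
    persistent_state: absent
    state: down
  - name: bond0.0
    persistent_state: absent
    state: down
  - name: bond0
    persistent_state: absent
    state: down
```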
11792 1727096157.26315: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096157.26527: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a3 11792 1727096157.26561: variable 'ansible_search_path' from source: unknown 11792 1727096157.26571: variable 'ansible_search_path' from source: unknown 11792 1727096157.26623: calling self._execute() 11792 1727096157.26727: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096157.26742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096157.26770: variable 'omit' from source: magic vars 11792 1727096157.27178: variable 'ansible_distribution_major_version' from source: facts 11792 1727096157.27197: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096157.27210: variable 'omit' from source: magic vars 11792 1727096157.27285: variable 'omit' from source: magic vars 11792 1727096157.27530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096157.29462: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096157.29512: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096157.29547: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096157.29578: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096157.29651: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096157.29700: variable 'network_provider' from source: set_fact 11792 1727096157.29895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096157.29901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096157.29904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096157.29906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096157.29921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096157.29995: variable 'omit' from source: magic vars 11792 1727096157.30098: variable 'omit' from source: magic vars 11792 1727096157.30200: variable 'network_connections' from source: task vars 11792 1727096157.30211: variable 'port2_profile' from source: play vars 11792 1727096157.30272: variable 'port2_profile' from source: play vars 11792 1727096157.30334: variable 'port1_profile' from source: play vars 11792 1727096157.30337: variable 'port1_profile' from source: play vars 11792 1727096157.30339: variable 'controller_profile' from source: 
play vars 11792 1727096157.30404: variable 'controller_profile' from source: play vars 11792 1727096157.30564: variable 'omit' from source: magic vars 11792 1727096157.30575: variable '__lsr_ansible_managed' from source: task vars 11792 1727096157.30631: variable '__lsr_ansible_managed' from source: task vars 11792 1727096157.30906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11792 1727096157.31044: Loaded config def from plugin (lookup/template) 11792 1727096157.31048: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11792 1727096157.31079: File lookup term: get_ansible_managed.j2 11792 1727096157.31082: variable 'ansible_search_path' from source: unknown 11792 1727096157.31093: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11792 1727096157.31099: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11792 1727096157.31231: variable 'ansible_search_path' from source: unknown 11792 1727096157.38388: variable 'ansible_managed' from source: unknown 11792 1727096157.38522: variable 'omit' from source: magic vars 11792 1727096157.38558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096157.38585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096157.38609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096157.38633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096157.38645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096157.38728: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096157.38731: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096157.38734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096157.38788: Set connection var ansible_timeout to 10 11792 1727096157.38796: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096157.38822: Set connection var ansible_shell_executable to /bin/sh 11792 1727096157.38826: Set connection var ansible_pipelining to False 11792 1727096157.38828: Set connection var 
ansible_shell_type to sh 11792 1727096157.38874: Set connection var ansible_connection to ssh 11792 1727096157.38877: variable 'ansible_shell_executable' from source: unknown 11792 1727096157.38880: variable 'ansible_connection' from source: unknown 11792 1727096157.38883: variable 'ansible_module_compression' from source: unknown 11792 1727096157.38885: variable 'ansible_shell_type' from source: unknown 11792 1727096157.38886: variable 'ansible_shell_executable' from source: unknown 11792 1727096157.38888: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096157.38890: variable 'ansible_pipelining' from source: unknown 11792 1727096157.38895: variable 'ansible_timeout' from source: unknown 11792 1727096157.38906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096157.39189: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096157.39193: variable 'omit' from source: magic vars 11792 1727096157.39197: starting attempt loop 11792 1727096157.39200: running the handler 11792 1727096157.39202: _low_level_execute_command(): starting 11792 1727096157.39204: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096157.39755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096157.39764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096157.39841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096157.39844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096157.39909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096157.39932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096157.39994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096157.41670: stdout chunk (state=3): >>>/root <<< 11792 1727096157.41835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096157.41838: stdout chunk (state=3): >>><<< 11792 1727096157.41841: stderr chunk (state=3): >>><<< 11792 1727096157.41966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096157.41971: _low_level_execute_command(): starting 11792 1727096157.41974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626 `" && echo ansible-tmp-1727096157.4187124-13684-67390228501626="` echo /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626 `" ) && sleep 0' 11792 1727096157.42550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096157.42576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096157.42622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096157.42640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096157.42664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096157.42739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096157.42787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096157.42821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096157.42876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096157.42990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096157.44906: stdout chunk (state=3): >>>ansible-tmp-1727096157.4187124-13684-67390228501626=/root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626 <<< 11792 1727096157.45137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096157.45173: stderr chunk (state=3): >>><<< 11792 1727096157.45195: stdout chunk (state=3): >>><<< 11792 1727096157.45384: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096157.4187124-13684-67390228501626=/root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096157.45387: variable 'ansible_module_compression' from source: unknown 11792 1727096157.45390: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11792 1727096157.45392: variable 'ansible_facts' from source: unknown 11792 1727096157.45553: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/AnsiballZ_network_connections.py 11792 1727096157.45743: Sending initial data 11792 1727096157.45849: Sent initial data (167 bytes) 11792 1727096157.46472: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096157.46512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096157.46557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096157.46710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096157.46790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096157.46810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096157.46870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096157.46945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096157.48637: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096157.48702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096157.48742: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpwexctsr0 /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/AnsiballZ_network_connections.py <<< 11792 1727096157.48745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/AnsiballZ_network_connections.py" <<< 11792 1727096157.48804: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpwexctsr0" to remote "/root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/AnsiballZ_network_connections.py" <<< 11792 1727096157.50199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096157.50203: stderr chunk (state=3): >>><<< 11792 1727096157.50206: stdout chunk (state=3): >>><<< 11792 1727096157.50208: done transferring module to remote 11792 1727096157.50212: _low_level_execute_command(): starting 11792 1727096157.50215: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/ /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/AnsiballZ_network_connections.py && sleep 0' 11792 1727096157.51023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096157.51049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096157.51055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096157.51133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 
1727096157.51142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096157.51147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096157.51209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096157.53334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096157.53338: stdout chunk (state=3): >>><<< 11792 1727096157.53340: stderr chunk (state=3): >>><<< 11792 1727096157.53343: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096157.53345: _low_level_execute_command(): starting 11792 1727096157.53347: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/AnsiballZ_network_connections.py && sleep 0' 11792 1727096157.53881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096157.53887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096157.53914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096157.53921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096157.53974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096157.53977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096157.53985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096157.54038: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11792 1727096158.11219: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 11792 1727096158.11261: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11792 1727096158.11265: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 11792 1727096158.11278: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/b2e827b6-dfb9-4571-949e-e48e368f579a: error=unknown <<< 11792 1727096158.13056: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4: error=unknown <<< 11792 1727096158.14769: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/d4ead546-ed37-4db8-b8f2-1191a6c9350f: error=unknown <<< 11792 1727096158.15014: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": 
"#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11792 1727096158.17301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096158.17305: stdout chunk (state=3): >>><<< 11792 1727096158.17307: stderr chunk (state=3): >>><<< 11792 1727096158.17341: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/b2e827b6-dfb9-4571-949e-e48e368f579a: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0e9b64bc-cd5f-4cf7-90f1-31d92bb27cd4: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__axhpo24/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/d4ead546-ed37-4db8-b8f2-1191a6c9350f: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": 
"down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096158.17408: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096158.17411: _low_level_execute_command(): starting 11792 1727096158.17414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096157.4187124-13684-67390228501626/ > /dev/null 2>&1 && sleep 0' 11792 1727096158.17965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096158.17977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096158.18017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096158.18020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.18022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
<<< 11792 1727096158.18025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096158.18027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.18091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096158.18094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.18109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.18142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.20148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.20154: stdout chunk (state=3): >>><<< 11792 1727096158.20157: stderr chunk (state=3): >>><<< 11792 1727096158.20274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096158.20277: handler run complete 11792 1727096158.20280: attempt loop complete, returning result 11792 1727096158.20282: _execute() done 11792 1727096158.20283: dumping result to json 11792 1727096158.20285: done dumping result, returning 11792 1727096158.20286: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-d9c7-3fc0-0000000006a3] 11792 1727096158.20288: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a3 11792 1727096158.20592: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a3 11792 1727096158.20595: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11792 1727096158.20741: no 
more pending results, returning what we have 11792 1727096158.20746: results queue empty 11792 1727096158.20746: checking for any_errors_fatal 11792 1727096158.20756: done checking for any_errors_fatal 11792 1727096158.20757: checking for max_fail_percentage 11792 1727096158.20759: done checking for max_fail_percentage 11792 1727096158.20760: checking to see if all hosts have failed and the running result is not ok 11792 1727096158.20760: done checking to see if all hosts have failed 11792 1727096158.20761: getting the remaining hosts for this loop 11792 1727096158.20763: done getting the remaining hosts for this loop 11792 1727096158.20766: getting the next task for host managed_node2 11792 1727096158.20776: done getting next task for host managed_node2 11792 1727096158.20779: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096158.20784: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096158.20800: getting variables 11792 1727096158.20802: in VariableManager get_vars() 11792 1727096158.20839: Calling all_inventory to load vars for managed_node2 11792 1727096158.20842: Calling groups_inventory to load vars for managed_node2 11792 1727096158.20845: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096158.20859: Calling all_plugins_play to load vars for managed_node2 11792 1727096158.20863: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096158.20866: Calling groups_plugins_play to load vars for managed_node2 11792 1727096158.22643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096158.24414: done with get_vars() 11792 1727096158.24438: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:58 -0400 (0:00:00.990) 0:00:40.524 ****** 11792 1727096158.24531: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096158.24997: worker is 1 (out of 1 available) 11792 1727096158.25010: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096158.25023: done queuing things up, now waiting for results queue to drain 11792 1727096158.25024: waiting for pending results... 
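The result above shows the fedora.linux_system_roles.network_connections module removing the three bond profiles (bond0, bond0.0, bond0.1) with persistent_state: absent and state: down under the nm provider. A minimal sketch of a play that would drive the role into an invocation like this, assuming the fedora.linux_system_roles collection is installed (the actual test playbook may be structured differently):

- hosts: managed_node2
  tasks:
    - name: Tear down the bond and its port profiles via the network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # matches "provider": "nm" in the logged module args
        network_connections:
          - name: bond0.1
            persistent_state: absent  # remove the persistent profile
            state: down               # and take the connection down
          - name: bond0.0
            persistent_state: absent
            state: down
          - name: bond0
            persistent_state: absent
            state: down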
11792 1727096158.25341: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096158.25548: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a4 11792 1727096158.25554: variable 'ansible_search_path' from source: unknown 11792 1727096158.25558: variable 'ansible_search_path' from source: unknown 11792 1727096158.25562: calling self._execute() 11792 1727096158.25640: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.25664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.25683: variable 'omit' from source: magic vars 11792 1727096158.26089: variable 'ansible_distribution_major_version' from source: facts 11792 1727096158.26109: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096158.26236: variable 'network_state' from source: role '' defaults 11792 1727096158.26254: Evaluated conditional (network_state != {}): False 11792 1727096158.26263: when evaluation is False, skipping this task 11792 1727096158.26311: _execute() done 11792 1727096158.26315: dumping result to json 11792 1727096158.26318: done dumping result, returning 11792 1727096158.26320: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-d9c7-3fc0-0000000006a4] 11792 1727096158.26322: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a4 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096158.26463: no more pending results, returning what we have 11792 1727096158.26470: results queue empty 11792 1727096158.26471: checking for any_errors_fatal 11792 1727096158.26482: done checking for any_errors_fatal 11792 1727096158.26483: checking for max_fail_percentage 11792 1727096158.26484: done checking for max_fail_percentage 11792 1727096158.26485: checking to see if all hosts have failed and the running result is not ok 11792 1727096158.26486: done checking to see if all hosts have failed 11792 1727096158.26486: getting the remaining hosts for this loop 11792 1727096158.26488: done getting the remaining hosts for this loop 11792 1727096158.26491: getting the next task for host managed_node2 11792 1727096158.26499: done getting next task for host managed_node2 11792 1727096158.26503: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096158.26508: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096158.26526: getting variables 11792 1727096158.26528: in VariableManager get_vars() 11792 1727096158.26672: Calling all_inventory to load vars for managed_node2 11792 1727096158.26677: Calling groups_inventory to load vars for managed_node2 11792 1727096158.26680: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096158.26687: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a4 11792 1727096158.26690: WORKER PROCESS EXITING 11792 1727096158.26702: Calling all_plugins_play to load vars for managed_node2 11792 1727096158.26706: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096158.26709: Calling groups_plugins_play to load vars for managed_node2 11792 1727096158.28344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096158.29989: done with get_vars() 11792 1727096158.30017: done getting variables 11792 1727096158.30084: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:58 -0400 (0:00:00.055) 0:00:40.580 ****** 11792 1727096158.30122: entering _queue_task() for managed_node2/debug 11792 1727096158.30613: worker is 1 (out of 1 available) 11792 1727096158.30623: exiting _queue_task() for managed_node2/debug 11792 1727096158.30636: done queuing things up, now waiting for results queue to drain 11792 1727096158.30637: waiting for pending results... 
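The "Configure networking state" task is skipped here because the role's network_state variable defaults to {} and the conditional network_state != {} evaluated to False. A hedged sketch of how a caller could exercise that branch instead, assuming network_state accepts an nmstate-style dictionary (the interface below is purely illustrative):

- hosts: managed_node2
  tasks:
    - name: Apply a declarative network state instead of connection profiles
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_state:            # non-empty, so network_state != {} would be True
          interfaces:
            - name: eth1          # illustrative interface name
              type: ethernet
              state: up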
11792 1727096158.30847: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096158.31027: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a5 11792 1727096158.31055: variable 'ansible_search_path' from source: unknown 11792 1727096158.31066: variable 'ansible_search_path' from source: unknown 11792 1727096158.31112: calling self._execute() 11792 1727096158.31246: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.31249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.31255: variable 'omit' from source: magic vars 11792 1727096158.31633: variable 'ansible_distribution_major_version' from source: facts 11792 1727096158.31650: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096158.31681: variable 'omit' from source: magic vars 11792 1727096158.31758: variable 'omit' from source: magic vars 11792 1727096158.31839: variable 'omit' from source: magic vars 11792 1727096158.31849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096158.31897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096158.31925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096158.31946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.32008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.32011: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096158.32013: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.32020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.32132: Set connection var ansible_timeout to 10 11792 1727096158.32146: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096158.32164: Set connection var ansible_shell_executable to /bin/sh 11792 1727096158.32176: Set connection var ansible_pipelining to False 11792 1727096158.32185: Set connection var ansible_shell_type to sh 11792 1727096158.32225: Set connection var ansible_connection to ssh 11792 1727096158.32228: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.32236: variable 'ansible_connection' from source: unknown 11792 1727096158.32239: variable 'ansible_module_compression' from source: unknown 11792 1727096158.32246: variable 'ansible_shell_type' from source: unknown 11792 1727096158.32255: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.32263: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.32334: variable 'ansible_pipelining' from source: unknown 11792 1727096158.32337: variable 'ansible_timeout' from source: unknown 11792 1727096158.32341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.32432: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 
1727096158.32460: variable 'omit' from source: magic vars 11792 1727096158.32473: starting attempt loop 11792 1727096158.32481: running the handler 11792 1727096158.32621: variable '__network_connections_result' from source: set_fact 11792 1727096158.32772: handler run complete 11792 1727096158.32776: attempt loop complete, returning result 11792 1727096158.32779: _execute() done 11792 1727096158.32781: dumping result to json 11792 1727096158.32784: done dumping result, returning 11792 1727096158.32786: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-d9c7-3fc0-0000000006a5] 11792 1727096158.32789: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a5 11792 1727096158.32861: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a5 11792 1727096158.32864: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 11792 1727096158.32931: no more pending results, returning what we have 11792 1727096158.32936: results queue empty 11792 1727096158.32937: checking for any_errors_fatal 11792 1727096158.32945: done checking for any_errors_fatal 11792 1727096158.32946: checking for max_fail_percentage 11792 1727096158.32948: done checking for max_fail_percentage 11792 1727096158.32949: checking to see if all hosts have failed and the running result is not ok 11792 1727096158.32950: done checking to see if all hosts have failed 11792 1727096158.32950: getting the remaining hosts for this loop 11792 1727096158.32955: done getting the remaining hosts for this loop 11792 1727096158.32959: getting the next task for host managed_node2 11792 1727096158.32967: done getting next task for host managed_node2 11792 1727096158.33175: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096158.33180: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096158.33193: getting variables 11792 1727096158.33194: in VariableManager get_vars() 11792 1727096158.33231: Calling all_inventory to load vars for managed_node2 11792 1727096158.33233: Calling groups_inventory to load vars for managed_node2 11792 1727096158.33236: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096158.33246: Calling all_plugins_play to load vars for managed_node2 11792 1727096158.33249: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096158.33255: Calling groups_plugins_play to load vars for managed_node2 11792 1727096158.34983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096158.36586: done with get_vars() 11792 1727096158.36619: done getting variables 11792 1727096158.36697: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:58 -0400 (0:00:00.066) 0:00:40.646 ****** 11792 1727096158.36744: entering _queue_task() for managed_node2/debug 11792 1727096158.37132: worker is 1 (out of 1 available) 11792 1727096158.37144: exiting _queue_task() for managed_node2/debug 11792 1727096158.37161: done queuing things up, now waiting for results queue to drain 11792 1727096158.37163: waiting for pending results... 11792 1727096158.37586: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096158.37676: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a6 11792 1727096158.37739: variable 'ansible_search_path' from source: unknown 11792 1727096158.37743: variable 'ansible_search_path' from source: unknown 11792 1727096158.37763: calling self._execute() 11792 1727096158.37866: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.37882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.37898: variable 'omit' from source: magic vars 11792 1727096158.38391: variable 'ansible_distribution_major_version' from source: facts 11792 1727096158.38395: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096158.38397: variable 'omit' from source: magic vars 11792 1727096158.38399: variable 'omit' from source: magic vars 11792 1727096158.38434: variable 'omit' from source: magic vars 11792 1727096158.38484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096158.38533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096158.38563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096158.38589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.38614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.38647: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096158.38658: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.38665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.38775: Set connection var ansible_timeout to 10 11792 1727096158.38791: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096158.38804: Set connection var ansible_shell_executable to /bin/sh 11792 1727096158.38813: Set connection var ansible_pipelining to False 11792 1727096158.38828: Set connection var ansible_shell_type to sh 11792 1727096158.38835: Set connection var ansible_connection to ssh 11792 1727096158.38865: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.38876: variable 'ansible_connection' from source: unknown 11792 1727096158.38937: variable 'ansible_module_compression' from source: unknown 11792 1727096158.38940: variable 'ansible_shell_type' from source: unknown 11792 1727096158.38942: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.38944: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.38946: variable 'ansible_pipelining' from source: unknown 11792 1727096158.38948: variable 'ansible_timeout' from source: unknown 11792 1727096158.38950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.39093: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096158.39111: variable 'omit' from source: magic vars 11792 1727096158.39121: starting attempt loop 11792 1727096158.39128: running the handler 11792 1727096158.39192: variable '__network_connections_result' from source: set_fact 11792 1727096158.39292: variable '__network_connections_result' from source: set_fact 11792 1727096158.39483: handler run complete 11792 1727096158.39487: attempt loop complete, returning result 11792 1727096158.39489: _execute() done 11792 1727096158.39491: dumping result to json 11792 1727096158.39493: done dumping result, returning 11792 1727096158.39503: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-d9c7-3fc0-0000000006a6] 11792 1727096158.39511: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a6 11792 1727096158.39716: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a6 11792 1727096158.39720: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11792 1727096158.39821: no more pending results, returning what we have 11792 1727096158.39825: results queue empty 11792 1727096158.39826: checking for any_errors_fatal 11792 1727096158.40075: done checking for any_errors_fatal 11792 
1727096158.40076: checking for max_fail_percentage 11792 1727096158.40079: done checking for max_fail_percentage 11792 1727096158.40079: checking to see if all hosts have failed and the running result is not ok 11792 1727096158.40080: done checking to see if all hosts have failed 11792 1727096158.40081: getting the remaining hosts for this loop 11792 1727096158.40082: done getting the remaining hosts for this loop 11792 1727096158.40085: getting the next task for host managed_node2 11792 1727096158.40093: done getting next task for host managed_node2 11792 1727096158.40097: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096158.40102: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096158.40115: getting variables 11792 1727096158.40116: in VariableManager get_vars() 11792 1727096158.40157: Calling all_inventory to load vars for managed_node2 11792 1727096158.40166: Calling groups_inventory to load vars for managed_node2 11792 1727096158.40175: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096158.40187: Calling all_plugins_play to load vars for managed_node2 11792 1727096158.40190: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096158.40193: Calling groups_plugins_play to load vars for managed_node2 11792 1727096158.41626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096158.43264: done with get_vars() 11792 1727096158.43296: done getting variables 11792 1727096158.43358: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:58 -0400 (0:00:00.066) 0:00:40.713 ****** 11792 1727096158.43394: entering _queue_task() for managed_node2/debug 11792 1727096158.43900: worker is 1 (out of 1 available) 11792 1727096158.43911: exiting _queue_task() for managed_node2/debug 11792 1727096158.43923: done queuing things up, now waiting for results queue to drain 11792 1727096158.43925: waiting for pending results... 11792 1727096158.44325: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096158.44338: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a7 11792 1727096158.44366: variable 'ansible_search_path' from source: unknown 11792 1727096158.44377: variable 'ansible_search_path' from source: unknown 11792 1727096158.44425: calling self._execute() 11792 1727096158.44533: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.44547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.44565: variable 'omit' from source: magic vars 11792 1727096158.44982: variable 'ansible_distribution_major_version' from source: facts 11792 1727096158.44996: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096158.45110: variable 'network_state' from source: role '' defaults 11792 1727096158.45124: Evaluated conditional (network_state != {}): False 11792 1727096158.45131: when evaluation is False, skipping this task 11792 1727096158.45137: _execute() done 11792 1727096158.45143: dumping result to json 11792 1727096158.45148: done dumping result, returning 11792 1727096158.45163: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-d9c7-3fc0-0000000006a7] 11792 1727096158.45174: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a7 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11792 1727096158.45336: no more pending results, returning what we have 11792 1727096158.45341: results queue empty 11792 1727096158.45342: checking for any_errors_fatal 11792 1727096158.45357: done checking 
for any_errors_fatal 11792 1727096158.45358: checking for max_fail_percentage 11792 1727096158.45360: done checking for max_fail_percentage 11792 1727096158.45362: checking to see if all hosts have failed and the running result is not ok 11792 1727096158.45362: done checking to see if all hosts have failed 11792 1727096158.45363: getting the remaining hosts for this loop 11792 1727096158.45365: done getting the remaining hosts for this loop 11792 1727096158.45371: getting the next task for host managed_node2 11792 1727096158.45378: done getting next task for host managed_node2 11792 1727096158.45382: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096158.45388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096158.45408: getting variables 11792 1727096158.45410: in VariableManager get_vars() 11792 1727096158.45450: Calling all_inventory to load vars for managed_node2 11792 1727096158.45455: Calling groups_inventory to load vars for managed_node2 11792 1727096158.45458: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096158.45686: Calling all_plugins_play to load vars for managed_node2 11792 1727096158.45690: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096158.45695: Calling groups_plugins_play to load vars for managed_node2 11792 1727096158.46304: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a7 11792 1727096158.46307: WORKER PROCESS EXITING 11792 1727096158.47407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096158.49039: done with get_vars() 11792 1727096158.49079: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:58 -0400 (0:00:00.057) 0:00:40.771 ****** 11792 1727096158.49188: entering _queue_task() for managed_node2/ping 11792 1727096158.49569: worker is 1 (out of 1 available) 11792 1727096158.49581: exiting _queue_task() for managed_node2/ping 11792 1727096158.49707: done queuing things up, now waiting for results queue to drain 11792 1727096158.49709: waiting for pending results... 
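Between those skipped state tasks, the role's "Show stderr messages for the network_connections" and "Show debug messages for the network_connections" steps printed __network_connections_result through the debug action, as seen in the ok: results above. Reconstructed from the logged output, the tasks are roughly of this shape (a sketch of the pattern, not the role's verbatim source):

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result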
11792 1727096158.49926: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096158.50110: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006a8 11792 1727096158.50131: variable 'ansible_search_path' from source: unknown 11792 1727096158.50139: variable 'ansible_search_path' from source: unknown 11792 1727096158.50265: calling self._execute() 11792 1727096158.50292: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.50305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.50319: variable 'omit' from source: magic vars 11792 1727096158.50714: variable 'ansible_distribution_major_version' from source: facts 11792 1727096158.50732: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096158.50743: variable 'omit' from source: magic vars 11792 1727096158.50832: variable 'omit' from source: magic vars 11792 1727096158.50877: variable 'omit' from source: magic vars 11792 1727096158.50930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096158.50975: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096158.51003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096158.51035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.51051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.51138: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096158.51141: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.51144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.51219: Set connection var ansible_timeout to 10 11792 1727096158.51235: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096158.51261: Set connection var ansible_shell_executable to /bin/sh 11792 1727096158.51274: Set connection var ansible_pipelining to False 11792 1727096158.51281: Set connection var ansible_shell_type to sh 11792 1727096158.51288: Set connection var ansible_connection to ssh 11792 1727096158.51314: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.51359: variable 'ansible_connection' from source: unknown 11792 1727096158.51363: variable 'ansible_module_compression' from source: unknown 11792 1727096158.51365: variable 'ansible_shell_type' from source: unknown 11792 1727096158.51369: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.51371: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.51373: variable 'ansible_pipelining' from source: unknown 11792 1727096158.51375: variable 'ansible_timeout' from source: unknown 11792 1727096158.51377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.51599: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096158.51672: variable 'omit' from source: magic vars 11792 
1727096158.51677: starting attempt loop 11792 1727096158.51680: running the handler 11792 1727096158.51682: _low_level_execute_command(): starting 11792 1727096158.51684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096158.52427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096158.52492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.52495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096158.52595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.52649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.52701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.54397: stdout chunk (state=3): >>>/root <<< 11792 1727096158.54562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.54565: stdout chunk (state=3): >>><<< 11792 1727096158.54570: stderr chunk (state=3): >>><<< 11792 1727096158.54592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096158.54612: _low_level_execute_command(): starting 11792 1727096158.54704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785 `" && echo 
ansible-tmp-1727096158.5459926-13728-142287715860785="` echo /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785 `" ) && sleep 0' 11792 1727096158.55311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096158.55314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096158.55325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.55373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096158.55394: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.55449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096158.55489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.55534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.55576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.57596: stdout chunk (state=3): >>>ansible-tmp-1727096158.5459926-13728-142287715860785=/root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785 <<< 11792 1727096158.57764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.57770: stdout chunk (state=3): >>><<< 11792 1727096158.57773: stderr chunk (state=3): >>><<< 11792 1727096158.57973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096158.5459926-13728-142287715860785=/root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 
1727096158.57977: variable 'ansible_module_compression' from source: unknown 11792 1727096158.57979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11792 1727096158.57982: variable 'ansible_facts' from source: unknown 11792 1727096158.58023: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/AnsiballZ_ping.py 11792 1727096158.58280: Sending initial data 11792 1727096158.58283: Sent initial data (153 bytes) 11792 1727096158.58874: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096158.58879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096158.58881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096158.58883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096158.58885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096158.58947: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.59181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.59246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.60937: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096158.60987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096158.61017: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpmd3zckqc /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/AnsiballZ_ping.py <<< 11792 1727096158.61020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/AnsiballZ_ping.py" <<< 11792 1727096158.61084: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpmd3zckqc" to remote "/root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/AnsiballZ_ping.py" <<< 11792 1727096158.61937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.61941: stdout chunk (state=3): >>><<< 11792 1727096158.61943: stderr chunk (state=3): >>><<< 11792 1727096158.61946: done transferring module to remote 11792 1727096158.61950: _low_level_execute_command(): starting 11792 1727096158.61965: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/ /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/AnsiballZ_ping.py && sleep 0' 11792 1727096158.62589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096158.62601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096158.62605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096158.62621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096158.62634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096158.62642: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096158.62652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.62666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096158.62678: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096158.62685: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096158.62692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096158.62711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096158.62781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.62789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.62859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.64843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.64847: stdout chunk (state=3): >>><<< 11792 1727096158.64850: stderr chunk 
(state=3): >>><<< 11792 1727096158.64950: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096158.64960: _low_level_execute_command(): starting 11792 1727096158.64963: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/AnsiballZ_ping.py && sleep 0' 11792 1727096158.65586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.65625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096158.65641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.65666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.65747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.81733: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11792 1727096158.82930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.82996: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096158.83029: stderr chunk (state=3): >>><<< 11792 1727096158.83039: stdout chunk (state=3): >>><<< 11792 1727096158.83073: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096158.83111: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096158.83174: _low_level_execute_command(): starting 11792 1727096158.83178: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096158.5459926-13728-142287715860785/ > /dev/null 2>&1 && sleep 0' 11792 1727096158.83836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096158.83851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096158.83874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096158.83894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096158.83956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.84017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096158.84068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.84109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.84165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.86196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.86200: stdout chunk (state=3): >>><<< 11792 1727096158.86203: stderr chunk (state=3): >>><<< 11792 1727096158.86205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096158.86208: handler run complete 11792 1727096158.86226: attempt loop complete, returning result 11792 1727096158.86230: _execute() done 11792 1727096158.86232: dumping result to json 11792 1727096158.86234: done dumping result, returning 11792 1727096158.86303: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-d9c7-3fc0-0000000006a8] 11792 1727096158.86306: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a8 11792 1727096158.86373: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006a8 11792 1727096158.86375: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11792 1727096158.86464: no more pending results, returning what we have 11792 1727096158.86470: results queue empty 11792 1727096158.86471: checking for any_errors_fatal 11792 1727096158.86477: done checking for any_errors_fatal 11792 1727096158.86477: checking for max_fail_percentage 11792 1727096158.86479: done checking for max_fail_percentage 11792 1727096158.86480: checking to see if all hosts have failed and the running result is not ok 11792 1727096158.86480: done checking to see if all hosts have failed 11792 1727096158.86481: getting the remaining hosts for this loop 11792 1727096158.86482: done getting the remaining hosts for this loop 11792 1727096158.86485: getting the next task for host managed_node2 11792 1727096158.86495: done getting next task for host managed_node2 11792 1727096158.86496: ^ task is: TASK: meta (role_complete) 
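The "Re-test connectivity" step that just completed is the ping module run over the multiplexed SSH connection: Ansible created a remote temp directory, copied AnsiballZ_ping.py via SFTP, executed it with /usr/bin/python3.12, and removed the temp directory, with the module answering {"ping": "pong"}. As a standalone task it is essentially the following; this is reconstructed from the log, and the role's actual task at tasks/main.yml:192 may carry additional conditions:

- name: Re-test connectivity
  ansible.builtin.ping: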
11792 1727096158.86502: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096158.86512: getting variables 11792 1727096158.86513: in VariableManager get_vars() 11792 1727096158.86549: Calling all_inventory to load vars for managed_node2 11792 1727096158.86551: Calling groups_inventory to load vars for managed_node2 11792 1727096158.86556: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096158.86565: Calling all_plugins_play to load vars for managed_node2 11792 1727096158.86630: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096158.86636: Calling groups_plugins_play to load vars for managed_node2 11792 1727096158.87944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096158.89686: done with get_vars() 11792 1727096158.89711: done getting variables 11792 1727096158.89795: done queuing things up, now waiting for results queue to drain 11792 1727096158.89798: results queue empty 11792 1727096158.89798: checking for any_errors_fatal 11792 1727096158.89801: done checking for any_errors_fatal 11792 1727096158.89802: checking for max_fail_percentage 11792 1727096158.89803: done checking for max_fail_percentage 11792 1727096158.89804: checking to see if all hosts have failed and the running result is not ok 11792 1727096158.89805: done checking to see if all hosts have failed 11792 1727096158.89805: getting the remaining hosts for this loop 11792 1727096158.89806: done getting the remaining hosts for this loop 11792 1727096158.89809: getting the next task for host managed_node2 11792 1727096158.89813: done getting next task for host managed_node2 11792 1727096158.89815: ^ task is: TASK: Delete the device '{{ controller_device }}' 11792 1727096158.89818: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096158.89820: getting variables 11792 1727096158.89821: in VariableManager get_vars() 11792 1727096158.89835: Calling all_inventory to load vars for managed_node2 11792 1727096158.89837: Calling groups_inventory to load vars for managed_node2 11792 1727096158.89839: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096158.89844: Calling all_plugins_play to load vars for managed_node2 11792 1727096158.89846: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096158.89848: Calling groups_plugins_play to load vars for managed_node2 11792 1727096158.90946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096158.92474: done with get_vars() 11792 1727096158.92507: done getting variables 11792 1727096158.92551: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096158.92682: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Monday 23 September 2024 08:55:58 -0400 (0:00:00.435) 0:00:41.206 ****** 11792 1727096158.92715: entering _queue_task() for managed_node2/command 11792 1727096158.93081: worker is 1 (out of 1 available) 11792 1727096158.93093: exiting _queue_task() for managed_node2/command 11792 1727096158.93109: done queuing things up, now waiting for results queue to drain 11792 1727096158.93111: waiting for pending results... 
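The TASK [Delete the device 'nm-bond'] banner just above comes from the cleanup include at cleanup_bond_profile+device.yml:22, with controller_device resolving to nm-bond from the play vars. A plausible reconstruction of that task is sketched below (the actual task file may differ in detail); the result later in the log shows "ip link del nm-bond" returning rc=1 ("Cannot find device") while failed_when_result stays false, so the cleanup tolerates a device that is already gone:

    # Plausible reconstruction of the queued cleanup task, not the verbatim
    # source; failed_when/changed_when set to false match the two
    # "Evaluated conditional (False): False" lines and the ok result despite rc=1.
    - name: Delete the device '{{ controller_device }}'
      command: ip link del {{ controller_device }}
      failed_when: false
      changed_when: false

Ignoring the return code makes sense for cleanup: the bond may already have been removed by the role, and the play only needs the device to be absent afterwards.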
11792 1727096158.93498: running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' 11792 1727096158.93774: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006d8 11792 1727096158.93778: variable 'ansible_search_path' from source: unknown 11792 1727096158.93781: variable 'ansible_search_path' from source: unknown 11792 1727096158.93784: calling self._execute() 11792 1727096158.93786: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.93789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.93791: variable 'omit' from source: magic vars 11792 1727096158.94176: variable 'ansible_distribution_major_version' from source: facts 11792 1727096158.94194: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096158.94206: variable 'omit' from source: magic vars 11792 1727096158.94239: variable 'omit' from source: magic vars 11792 1727096158.94335: variable 'controller_device' from source: play vars 11792 1727096158.94366: variable 'omit' from source: magic vars 11792 1727096158.94415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096158.94460: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096158.94488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096158.94511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.94525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096158.94567: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096158.94578: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.94586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.94694: Set connection var ansible_timeout to 10 11792 1727096158.94772: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096158.94777: Set connection var ansible_shell_executable to /bin/sh 11792 1727096158.94782: Set connection var ansible_pipelining to False 11792 1727096158.94785: Set connection var ansible_shell_type to sh 11792 1727096158.94787: Set connection var ansible_connection to ssh 11792 1727096158.94789: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.94791: variable 'ansible_connection' from source: unknown 11792 1727096158.94793: variable 'ansible_module_compression' from source: unknown 11792 1727096158.94795: variable 'ansible_shell_type' from source: unknown 11792 1727096158.94797: variable 'ansible_shell_executable' from source: unknown 11792 1727096158.94799: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096158.94801: variable 'ansible_pipelining' from source: unknown 11792 1727096158.94803: variable 'ansible_timeout' from source: unknown 11792 1727096158.94806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096158.94957: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11792 1727096158.95006: variable 'omit' from source: magic vars 11792 1727096158.95017: starting attempt loop 11792 1727096158.95024: running the handler 11792 1727096158.95047: _low_level_execute_command(): starting 11792 1727096158.95061: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096158.95993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.96019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096158.96076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.96106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096158.98060: stdout chunk (state=3): >>>/root <<< 11792 1727096158.98063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096158.98066: stdout chunk (state=3): >>><<< 11792 1727096158.98070: stderr chunk (state=3): >>><<< 11792 1727096158.98091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096158.98251: _low_level_execute_command(): starting 11792 1727096158.98255: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978 `" && echo ansible-tmp-1727096158.981453-13754-259032195228978="` 
echo /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978 `" ) && sleep 0' 11792 1727096158.99388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096158.99615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096158.99697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096158.99706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.01751: stdout chunk (state=3): >>>ansible-tmp-1727096158.981453-13754-259032195228978=/root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978 <<< 11792 1727096159.01941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.01989: stderr chunk (state=3): >>><<< 11792 1727096159.02047: stdout chunk (state=3): >>><<< 11792 1727096159.02066: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096158.981453-13754-259032195228978=/root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.02106: variable 'ansible_module_compression' from source: unknown 11792 1727096159.02374: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096159.02377: variable 'ansible_facts' from source: unknown 11792 1727096159.02458: transferring module 
to remote /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/AnsiballZ_command.py 11792 1727096159.02791: Sending initial data 11792 1727096159.02802: Sent initial data (155 bytes) 11792 1727096159.04096: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.04196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.04215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.04291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.05941: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096159.05949: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11792 1727096159.05960: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11792 1727096159.05997: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096159.06031: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096159.06106: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp0xjda0cj /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/AnsiballZ_command.py <<< 11792 1727096159.06126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp0xjda0cj" to remote "/root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/AnsiballZ_command.py" <<< 11792 1727096159.07188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.07192: stderr chunk (state=3): >>><<< 11792 1727096159.07195: stdout chunk (state=3): >>><<< 11792 1727096159.07218: done transferring module to remote 11792 1727096159.07230: _low_level_execute_command(): starting 11792 1727096159.07233: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/ /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/AnsiballZ_command.py && sleep 0' 11792 1727096159.07862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096159.07878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096159.07885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.07901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096159.07984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.08003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.08059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.08064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.08093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.10011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.10035: stderr chunk (state=3): >>><<< 11792 1727096159.10038: stdout chunk (state=3): >>><<< 11792 1727096159.10052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 
originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.10081: _low_level_execute_command(): starting 11792 1727096159.10088: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/AnsiballZ_command.py && sleep 0' 11792 1727096159.10744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096159.10772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.10822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.10840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.10889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.10915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.10950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.11019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.27819: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:55:59.268762", "end": "2024-09-23 08:55:59.276617", "delta": "0:00:00.007855", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096159.29575: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096159.29579: stdout chunk (state=3): >>><<< 11792 1727096159.29582: stderr chunk (state=3): >>><<< 11792 1727096159.29584: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:55:59.268762", "end": "2024-09-23 08:55:59.276617", "delta": "0:00:00.007855", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
11792 1727096159.29587: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096159.29590: _low_level_execute_command(): starting 11792 1727096159.29592: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096158.981453-13754-259032195228978/ > /dev/null 2>&1 && sleep 0' 11792 1727096159.30198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096159.30213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096159.30271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.30342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.30374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.30402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.30473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.32475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.32488: stderr chunk (state=3): >>><<< 11792 1727096159.32491: stdout chunk (state=3): >>><<< 11792 1727096159.32512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.32519: handler run complete 11792 1727096159.32545: Evaluated conditional (False): False 11792 1727096159.32548: Evaluated conditional (False): False 11792 1727096159.32560: attempt loop complete, returning result 11792 1727096159.32563: _execute() done 11792 1727096159.32566: dumping result to json 11792 1727096159.32576: done dumping result, returning 11792 1727096159.32592: done running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' [0afff68d-5257-d9c7-3fc0-0000000006d8] 11792 1727096159.32602: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006d8 11792 1727096159.32749: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006d8 11792 1727096159.32752: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007855", "end": "2024-09-23 08:55:59.276617", "failed_when_result": false, "rc": 1, "start": "2024-09-23 08:55:59.268762" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11792 1727096159.32833: no more pending results, returning what we have 11792 1727096159.32837: results queue empty 11792 1727096159.32838: checking for any_errors_fatal 11792 1727096159.32841: done checking for any_errors_fatal 11792 1727096159.32842: checking for max_fail_percentage 11792 1727096159.32844: done checking for max_fail_percentage 11792 1727096159.32844: checking to see if all hosts have failed and the running result is not ok 11792 1727096159.32845: done checking to see if all hosts have failed 11792 1727096159.32846: getting the remaining hosts for this loop 11792 1727096159.32847: done getting the remaining hosts for this loop 11792 1727096159.32851: getting the next task for host managed_node2 11792 1727096159.32864: done getting next task for host managed_node2 11792 1727096159.32868: ^ task is: TASK: Remove test interfaces 11792 1727096159.32872: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096159.32877: getting variables 11792 1727096159.32880: in VariableManager get_vars() 11792 1727096159.32922: Calling all_inventory to load vars for managed_node2 11792 1727096159.32925: Calling groups_inventory to load vars for managed_node2 11792 1727096159.32928: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096159.32941: Calling all_plugins_play to load vars for managed_node2 11792 1727096159.32944: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096159.32947: Calling groups_plugins_play to load vars for managed_node2 11792 1727096159.34877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096159.36744: done with get_vars() 11792 1727096159.36787: done getting variables 11792 1727096159.36856: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:55:59 -0400 (0:00:00.441) 0:00:41.648 ****** 11792 1727096159.36894: entering _queue_task() for managed_node2/shell 11792 1727096159.37477: worker is 1 (out of 1 available) 11792 1727096159.37489: exiting _queue_task() for managed_node2/shell 11792 1727096159.37500: done queuing things up, now waiting for results queue to drain 11792 1727096159.37501: waiting for pending results... 
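The "Remove test interfaces" task queued here runs a small shell script whose text is reproduced verbatim in the module arguments further down. As playbook YAML it corresponds roughly to the following sketch; the script body is taken from the log's result, while the interface names test1 and test2 presumably being templated from the dhcp_interface1 and dhcp_interface2 play vars read just above is an assumption:

    # Sketch of the queued shell task; the script matches the _raw_params shown
    # in the result below. "exec 1>&2" routes all trace output to stderr, which
    # is why the result has an empty stdout and the xtrace output in stderr.
    - name: Remove test interfaces
      shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        ip link delete test1 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test1 - error "$rc"
        fi
        ip link delete test2 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test2 - error "$rc"
        fi
        ip link delete testbr || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link testbr - error "$rc"
        fi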
11792 1727096159.38172: running TaskExecutor() for managed_node2/TASK: Remove test interfaces 11792 1727096159.38178: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006de 11792 1727096159.38182: variable 'ansible_search_path' from source: unknown 11792 1727096159.38185: variable 'ansible_search_path' from source: unknown 11792 1727096159.38188: calling self._execute() 11792 1727096159.38242: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096159.38259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096159.38277: variable 'omit' from source: magic vars 11792 1727096159.38672: variable 'ansible_distribution_major_version' from source: facts 11792 1727096159.38691: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096159.38702: variable 'omit' from source: magic vars 11792 1727096159.38771: variable 'omit' from source: magic vars 11792 1727096159.38936: variable 'dhcp_interface1' from source: play vars 11792 1727096159.38950: variable 'dhcp_interface2' from source: play vars 11792 1727096159.38985: variable 'omit' from source: magic vars 11792 1727096159.39032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096159.39137: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096159.39166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096159.39195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096159.39210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096159.39244: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096159.39298: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096159.39301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096159.39449: Set connection var ansible_timeout to 10 11792 1727096159.39469: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096159.39485: Set connection var ansible_shell_executable to /bin/sh 11792 1727096159.39577: Set connection var ansible_pipelining to False 11792 1727096159.39579: Set connection var ansible_shell_type to sh 11792 1727096159.39581: Set connection var ansible_connection to ssh 11792 1727096159.39583: variable 'ansible_shell_executable' from source: unknown 11792 1727096159.39585: variable 'ansible_connection' from source: unknown 11792 1727096159.39587: variable 'ansible_module_compression' from source: unknown 11792 1727096159.39589: variable 'ansible_shell_type' from source: unknown 11792 1727096159.39590: variable 'ansible_shell_executable' from source: unknown 11792 1727096159.39592: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096159.39594: variable 'ansible_pipelining' from source: unknown 11792 1727096159.39595: variable 'ansible_timeout' from source: unknown 11792 1727096159.39597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096159.39724: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096159.39739: variable 'omit' from source: magic vars 11792 1727096159.39747: starting attempt loop 11792 1727096159.39756: running the handler 11792 1727096159.39772: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096159.39797: _low_level_execute_command(): starting 11792 1727096159.39809: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096159.40513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096159.40577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.40663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.40687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.40723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.40785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.42721: stdout chunk (state=3): >>>/root <<< 11792 1727096159.42949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.42956: stdout chunk (state=3): >>><<< 11792 1727096159.42958: stderr chunk (state=3): >>><<< 11792 1727096159.42961: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.42965: _low_level_execute_command(): starting 11792 1727096159.42971: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670 `" && echo ansible-tmp-1727096159.4285066-13781-201353705693670="` echo /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670 `" ) && sleep 0' 11792 1727096159.44144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.44163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.44170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.44173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.44415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.44420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.46473: stdout chunk (state=3): >>>ansible-tmp-1727096159.4285066-13781-201353705693670=/root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670 <<< 11792 1727096159.46664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.46693: stderr chunk (state=3): >>><<< 11792 1727096159.47075: stdout chunk (state=3): >>><<< 11792 1727096159.47079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096159.4285066-13781-201353705693670=/root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.47081: variable 'ansible_module_compression' from source: unknown 11792 1727096159.47083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096159.47085: variable 'ansible_facts' from source: unknown 11792 1727096159.47225: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/AnsiballZ_command.py 11792 1727096159.47903: Sending initial data 11792 1727096159.47908: Sent initial data (156 bytes) 11792 1727096159.48669: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.48781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096159.48796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.49042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.49076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.49182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.50943: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096159.51006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096159.51184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpm3i0af95 /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/AnsiballZ_command.py <<< 11792 1727096159.51347: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpm3i0af95" to remote "/root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/AnsiballZ_command.py" <<< 11792 1727096159.52352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.52410: stderr chunk (state=3): >>><<< 11792 1727096159.52421: stdout chunk (state=3): >>><<< 11792 1727096159.52451: done transferring module to remote 11792 1727096159.52577: _low_level_execute_command(): starting 11792 1727096159.52580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/ /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/AnsiballZ_command.py && sleep 0' 11792 1727096159.54094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.54188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.54284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.56144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.56148: stdout chunk (state=3): >>><<< 11792 1727096159.56159: stderr chunk (state=3): >>><<< 11792 1727096159.56264: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.56269: _low_level_execute_command(): starting 11792 1727096159.56272: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/AnsiballZ_command.py && sleep 0' 11792 1727096159.57521: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.57557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.57706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.57709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.57987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.57991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.77996: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:55:59.738698", "end": "2024-09-23 08:55:59.778457", "delta": "0:00:00.039759", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", 
"_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096159.79704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096159.79709: stdout chunk (state=3): >>><<< 11792 1727096159.79711: stderr chunk (state=3): >>><<< 11792 1727096159.79734: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:55:59.738698", "end": "2024-09-23 08:55:59.778457", "delta": "0:00:00.039759", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096159.79777: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096159.79786: _low_level_execute_command(): starting 11792 1727096159.79789: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096159.4285066-13781-201353705693670/ > /dev/null 2>&1 && sleep 0' 11792 1727096159.80395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096159.80398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096159.80411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.80427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096159.80440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096159.80447: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096159.80457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.80474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096159.80482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096159.80489: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096159.80502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096159.80508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096159.80519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096159.80527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096159.80534: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096159.80544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.80632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.80636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.80644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11792 1727096159.80721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.82880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.82884: stdout chunk (state=3): >>><<< 11792 1727096159.82887: stderr chunk (state=3): >>><<< 11792 1727096159.82889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.82891: handler run complete 11792 1727096159.82893: Evaluated conditional (False): False 11792 1727096159.82895: attempt loop complete, returning result 11792 1727096159.82897: _execute() done 11792 1727096159.82899: dumping result to json 11792 1727096159.82901: done dumping result, returning 11792 1727096159.82902: done running TaskExecutor() for managed_node2/TASK: Remove test interfaces [0afff68d-5257-d9c7-3fc0-0000000006de] 11792 1727096159.82904: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006de ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.039759", "end": "2024-09-23 08:55:59.778457", "rc": 0, "start": "2024-09-23 08:55:59.738698" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11792 1727096159.83311: no more pending results, returning what we have 11792 1727096159.83315: results queue empty 11792 1727096159.83316: checking for any_errors_fatal 11792 1727096159.83330: done checking for any_errors_fatal 11792 1727096159.83331: checking for max_fail_percentage 11792 1727096159.83334: done checking for max_fail_percentage 11792 1727096159.83335: checking to see if all hosts have failed and the running result is not ok 11792 1727096159.83336: done checking to see if all hosts have failed 11792 1727096159.83336: getting the remaining hosts for this loop 11792 1727096159.83338: done getting the remaining hosts for this loop 11792 1727096159.83344: getting the next task 
for host managed_node2 11792 1727096159.83355: done getting next task for host managed_node2 11792 1727096159.83358: ^ task is: TASK: Stop dnsmasq/radvd services 11792 1727096159.83362: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096159.83770: getting variables 11792 1727096159.83772: in VariableManager get_vars() 11792 1727096159.83810: Calling all_inventory to load vars for managed_node2 11792 1727096159.83813: Calling groups_inventory to load vars for managed_node2 11792 1727096159.83815: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096159.83984: Calling all_plugins_play to load vars for managed_node2 11792 1727096159.83988: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096159.83991: Calling groups_plugins_play to load vars for managed_node2 11792 1727096159.84603: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006de 11792 1727096159.84607: WORKER PROCESS EXITING 11792 1727096159.86713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096159.90689: done with get_vars() 11792 1727096159.90725: done getting variables 11792 1727096159.90929: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Monday 23 September 2024 08:55:59 -0400 (0:00:00.540) 0:00:42.188 ****** 11792 1727096159.91064: entering _queue_task() for managed_node2/shell 11792 1727096159.91484: worker is 1 (out of 1 available) 11792 1727096159.91497: exiting _queue_task() for managed_node2/shell 11792 1727096159.91511: done queuing things up, now waiting for results queue to drain 11792 1727096159.91512: waiting for pending results... 
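The records that follow repeat the same remote execution pattern already seen for the previous task. In outline, the per-task command sequence visible in this log is roughly the following (a sketch assembled from the surrounding records; <tmpdir> stands in for the ansible-tmp-<timestamp>-<pid>-<random> directory created for each task):

    /bin/sh -c 'echo ~ && sleep 0'                                             # resolve the remote home directory
    /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `" && mkdir "` echo <tmpdir> `" ) && sleep 0'
    # sftp: put the locally built AnsiballZ_command.py payload into <tmpdir>
    /bin/sh -c 'chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_command.py && sleep 0'
    /bin/sh -c '/usr/bin/python3.12 <tmpdir>/AnsiballZ_command.py && sleep 0'  # run the module; JSON result on stdout
    /bin/sh -c 'rm -f -r <tmpdir>/ > /dev/null 2>&1 && sleep 0'                # clean up the task tmpdir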
11792 1727096159.91822: running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services 11792 1727096159.91949: in run() - task 0afff68d-5257-d9c7-3fc0-0000000006df 11792 1727096159.91998: variable 'ansible_search_path' from source: unknown 11792 1727096159.92005: variable 'ansible_search_path' from source: unknown 11792 1727096159.92046: calling self._execute() 11792 1727096159.92161: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096159.92177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096159.92191: variable 'omit' from source: magic vars 11792 1727096159.92580: variable 'ansible_distribution_major_version' from source: facts 11792 1727096159.92596: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096159.92608: variable 'omit' from source: magic vars 11792 1727096159.92676: variable 'omit' from source: magic vars 11792 1727096159.92713: variable 'omit' from source: magic vars 11792 1727096159.92766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096159.92809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096159.92834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096159.92866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096159.92886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096159.92992: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096159.93046: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096159.93082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096159.93346: Set connection var ansible_timeout to 10 11792 1727096159.93473: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096159.93477: Set connection var ansible_shell_executable to /bin/sh 11792 1727096159.93479: Set connection var ansible_pipelining to False 11792 1727096159.93481: Set connection var ansible_shell_type to sh 11792 1727096159.93483: Set connection var ansible_connection to ssh 11792 1727096159.93484: variable 'ansible_shell_executable' from source: unknown 11792 1727096159.93487: variable 'ansible_connection' from source: unknown 11792 1727096159.93489: variable 'ansible_module_compression' from source: unknown 11792 1727096159.93672: variable 'ansible_shell_type' from source: unknown 11792 1727096159.93675: variable 'ansible_shell_executable' from source: unknown 11792 1727096159.93678: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096159.93680: variable 'ansible_pipelining' from source: unknown 11792 1727096159.93682: variable 'ansible_timeout' from source: unknown 11792 1727096159.93685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096159.94403: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096159.94474: variable 'omit' from source: magic vars 11792 
1727096159.94477: starting attempt loop 11792 1727096159.94480: running the handler 11792 1727096159.94483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096159.94485: _low_level_execute_command(): starting 11792 1727096159.94487: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096159.95828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.95873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096159.96091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.96152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096159.97895: stdout chunk (state=3): >>>/root <<< 11792 1727096159.98025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096159.98030: stdout chunk (state=3): >>><<< 11792 1727096159.98038: stderr chunk (state=3): >>><<< 11792 1727096159.98062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096159.98080: _low_level_execute_command(): starting 11792 1727096159.98090: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955 `" && echo ansible-tmp-1727096159.9806414-13817-121444454116955="` echo /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955 `" ) && sleep 0' 11792 1727096159.99387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096159.99629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096159.99636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096159.99703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096160.01758: stdout chunk (state=3): >>>ansible-tmp-1727096159.9806414-13817-121444454116955=/root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955 <<< 11792 1727096160.02441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096160.02445: stdout chunk (state=3): >>><<< 11792 1727096160.02447: stderr chunk (state=3): >>><<< 11792 1727096160.02450: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096159.9806414-13817-121444454116955=/root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096160.02454: variable 'ansible_module_compression' from source: unknown 
11792 1727096160.02457: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096160.02524: variable 'ansible_facts' from source: unknown 11792 1727096160.02711: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/AnsiballZ_command.py 11792 1727096160.03081: Sending initial data 11792 1727096160.03092: Sent initial data (156 bytes) 11792 1727096160.03997: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096160.04016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096160.04047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096160.04086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096160.04100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096160.04161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096160.04216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096160.04243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096160.04297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096160.04380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096160.06071: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096160.06580: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/AnsiballZ_command.py" <<< 11792 1727096160.06585: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmplpiptmox /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/AnsiballZ_command.py <<< 11792 1727096160.06658: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmplpiptmox" to remote "/root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/AnsiballZ_command.py" <<< 11792 1727096160.07777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096160.07840: stderr chunk (state=3): >>><<< 11792 1727096160.07846: stdout chunk (state=3): >>><<< 11792 1727096160.07874: done transferring module to remote 11792 1727096160.07885: _low_level_execute_command(): starting 11792 1727096160.07891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/ /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/AnsiballZ_command.py && sleep 0' 11792 1727096160.09185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096160.09277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096160.09368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096160.09491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096160.09508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096160.09532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096160.09687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096160.11696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096160.11700: stdout chunk (state=3): >>><<< 11792 1727096160.11702: stderr chunk (state=3): >>><<< 11792 1727096160.11720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096160.11883: _low_level_execute_command(): starting 11792 1727096160.11887: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/AnsiballZ_command.py && sleep 0' 11792 1727096160.12826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096160.12900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096160.13123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096160.13286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096160.13352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096160.32073: stdout chunk (state=3): >>> <<< 11792 1727096160.32108: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:56:00.291280", "end": "2024-09-23 08:56:00.319178", "delta": "0:00:00.027898", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm 
-rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096160.33812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096160.33817: stdout chunk (state=3): >>><<< 11792 1727096160.33976: stderr chunk (state=3): >>><<< 11792 1727096160.33981: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:56:00.291280", "end": "2024-09-23 08:56:00.319178", "delta": "0:00:00.027898", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096160.33989: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096160.33992: _low_level_execute_command(): starting 11792 1727096160.33994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096159.9806414-13817-121444454116955/ > /dev/null 2>&1 && sleep 0' 11792 1727096160.34528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096160.34543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096160.34560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096160.34672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096160.34695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096160.34760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096160.36756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096160.36772: stdout chunk (state=3): >>><<< 11792 1727096160.37174: stderr chunk (state=3): >>><<< 11792 1727096160.37179: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096160.37182: handler run complete 11792 1727096160.37184: Evaluated conditional (False): False 11792 1727096160.37185: attempt loop complete, returning result 11792 1727096160.37187: _execute() done 11792 1727096160.37188: dumping result to json 11792 1727096160.37190: done dumping result, returning 11792 1727096160.37191: done running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services [0afff68d-5257-d9c7-3fc0-0000000006df] 11792 1727096160.37193: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006df ok: [managed_node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.027898", "end": "2024-09-23 08:56:00.319178", "rc": 0, "start": "2024-09-23 08:56:00.291280" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11792 1727096160.37335: no more pending results, returning what we have 11792 1727096160.37340: results queue empty 11792 1727096160.37341: checking for any_errors_fatal 11792 1727096160.37348: done checking for any_errors_fatal 11792 1727096160.37349: checking for max_fail_percentage 11792 1727096160.37351: done checking for max_fail_percentage 11792 1727096160.37352: checking to see if all hosts have failed and the running result is not ok 11792 1727096160.37352: done checking to see if all hosts have failed 11792 1727096160.37353: getting the remaining hosts for this loop 11792 1727096160.37355: done getting the remaining hosts for this loop 11792 1727096160.37359: getting the next task for host managed_node2 11792 1727096160.37374: done getting next task for host managed_node2 11792 1727096160.37377: ^ task is: TASK: Reset bond options to assert 11792 1727096160.37379: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096160.37384: getting variables 11792 1727096160.37385: in VariableManager get_vars() 11792 1727096160.37429: Calling all_inventory to load vars for managed_node2 11792 1727096160.37432: Calling groups_inventory to load vars for managed_node2 11792 1727096160.37435: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096160.37449: Calling all_plugins_play to load vars for managed_node2 11792 1727096160.37452: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096160.37456: Calling groups_plugins_play to load vars for managed_node2 11792 1727096160.38274: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000006df 11792 1727096160.38278: WORKER PROCESS EXITING 11792 1727096160.52214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096160.55504: done with get_vars() 11792 1727096160.55539: done getting variables 11792 1727096160.55589: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reset bond options to assert] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:59 Monday 23 September 2024 08:56:00 -0400 (0:00:00.645) 0:00:42.835 ****** 11792 1727096160.55617: entering _queue_task() for managed_node2/set_fact 11792 1727096160.56366: worker is 1 (out of 1 available) 11792 1727096160.56382: exiting _queue_task() for managed_node2/set_fact 11792 1727096160.56398: done queuing things up, now waiting for results queue to drain 11792 1727096160.56400: waiting for pending results... 
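As with the previous task, the escaped _raw_params in the "Stop dnsmasq/radvd services" result above unpacks to the following shell script (reconstructed directly from that record):

    set -uxo pipefail
    exec 1>&2
    # Stop the dnsmasq instance started for testbr and remove its pid/lease files.
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
        # Stop radvd server
        service radvd stop
        iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
        for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
                firewall-cmd --remove-service "$service"
            fi
        done
    fi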
11792 1727096160.56928: running TaskExecutor() for managed_node2/TASK: Reset bond options to assert 11792 1727096160.57206: in run() - task 0afff68d-5257-d9c7-3fc0-00000000000f 11792 1727096160.57366: variable 'ansible_search_path' from source: unknown 11792 1727096160.57372: calling self._execute() 11792 1727096160.57619: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.57631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.57649: variable 'omit' from source: magic vars 11792 1727096160.58755: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.59374: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.59377: variable 'omit' from source: magic vars 11792 1727096160.59380: variable 'omit' from source: magic vars 11792 1727096160.59384: variable 'dhcp_interface1' from source: play vars 11792 1727096160.59386: variable 'dhcp_interface1' from source: play vars 11792 1727096160.59574: variable 'omit' from source: magic vars 11792 1727096160.59645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096160.59692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.59717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096160.59740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.59759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.59797: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.59881: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.59890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.60107: Set connection var ansible_timeout to 10 11792 1727096160.60123: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.60137: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.60149: Set connection var ansible_pipelining to False 11792 1727096160.60179: Set connection var ansible_shell_type to sh 11792 1727096160.60188: Set connection var ansible_connection to ssh 11792 1727096160.60473: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.60477: variable 'ansible_connection' from source: unknown 11792 1727096160.60480: variable 'ansible_module_compression' from source: unknown 11792 1727096160.60483: variable 'ansible_shell_type' from source: unknown 11792 1727096160.60486: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.60488: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.60491: variable 'ansible_pipelining' from source: unknown 11792 1727096160.60494: variable 'ansible_timeout' from source: unknown 11792 1727096160.60496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.60593: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11792 1727096160.60691: variable 'omit' from source: magic vars 11792 1727096160.60704: starting attempt loop 11792 1727096160.60711: running the handler 11792 1727096160.60727: handler run complete 11792 1727096160.60758: attempt loop complete, returning result 11792 1727096160.60769: _execute() done 11792 1727096160.60777: dumping result to json 11792 1727096160.60860: done dumping result, returning 11792 1727096160.60872: done running TaskExecutor() for managed_node2/TASK: Reset bond options to assert [0afff68d-5257-d9c7-3fc0-00000000000f] 11792 1727096160.60875: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000000f 11792 1727096160.60992: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000000f 11792 1727096160.60996: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "bond_options_to_assert": [ { "key": "mode", "value": "active-backup" }, { "key": "arp_interval", "value": "60" }, { "key": "arp_ip_target", "value": "192.0.2.128" }, { "key": "arp_validate", "value": "none" }, { "key": "primary", "value": "test1" } ] }, "changed": false } 11792 1727096160.61145: no more pending results, returning what we have 11792 1727096160.61152: results queue empty 11792 1727096160.61153: checking for any_errors_fatal 11792 1727096160.61169: done checking for any_errors_fatal 11792 1727096160.61170: checking for max_fail_percentage 11792 1727096160.61172: done checking for max_fail_percentage 11792 1727096160.61173: checking to see if all hosts have failed and the running result is not ok 11792 1727096160.61174: done checking to see if all hosts have failed 11792 1727096160.61175: getting the remaining hosts for this loop 11792 1727096160.61176: done getting the remaining hosts for this loop 11792 1727096160.61181: getting the next task for host managed_node2 11792 1727096160.61190: done getting next task for host managed_node2 11792 1727096160.61193: ^ task is: TASK: Include the task 'run_test.yml' 11792 1727096160.61195: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096160.61199: getting variables 11792 1727096160.61201: in VariableManager get_vars() 11792 1727096160.61243: Calling all_inventory to load vars for managed_node2 11792 1727096160.61246: Calling groups_inventory to load vars for managed_node2 11792 1727096160.61249: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096160.61262: Calling all_plugins_play to load vars for managed_node2 11792 1727096160.61266: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096160.61474: Calling groups_plugins_play to load vars for managed_node2 11792 1727096160.63514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096160.66137: done with get_vars() 11792 1727096160.66175: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:72 Monday 23 September 2024 08:56:00 -0400 (0:00:00.106) 0:00:42.941 ****** 11792 1727096160.66279: entering _queue_task() for managed_node2/include_tasks 11792 1727096160.66650: worker is 1 (out of 1 available) 11792 1727096160.66662: exiting _queue_task() for managed_node2/include_tasks 11792 1727096160.66680: done queuing things up, now waiting for results queue to drain 11792 1727096160.66682: waiting for pending results... 11792 1727096160.67117: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 11792 1727096160.67122: in run() - task 0afff68d-5257-d9c7-3fc0-000000000011 11792 1727096160.67134: variable 'ansible_search_path' from source: unknown 11792 1727096160.67182: calling self._execute() 11792 1727096160.67294: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.67302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.67312: variable 'omit' from source: magic vars 11792 1727096160.67861: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.67875: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.67881: _execute() done 11792 1727096160.67884: dumping result to json 11792 1727096160.67886: done dumping result, returning 11792 1727096160.67906: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0afff68d-5257-d9c7-3fc0-000000000011] 11792 1727096160.68124: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000011 11792 1727096160.68222: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000011 11792 1727096160.68228: WORKER PROCESS EXITING 11792 1727096160.68261: no more pending results, returning what we have 11792 1727096160.68266: in VariableManager get_vars() 11792 1727096160.68322: Calling all_inventory to load vars for managed_node2 11792 1727096160.68325: Calling groups_inventory to load vars for managed_node2 11792 1727096160.68328: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096160.68343: Calling all_plugins_play to load vars for managed_node2 11792 1727096160.68346: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096160.68349: Calling groups_plugins_play to load vars for managed_node2 11792 1727096160.70665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096160.72252: done with get_vars() 11792 1727096160.72284: variable 
'ansible_search_path' from source: unknown 11792 1727096160.72302: we have included files to process 11792 1727096160.72303: generating all_blocks data 11792 1727096160.72308: done generating all_blocks data 11792 1727096160.72313: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11792 1727096160.72315: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11792 1727096160.72317: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11792 1727096160.72728: in VariableManager get_vars() 11792 1727096160.72748: done with get_vars() 11792 1727096160.72791: in VariableManager get_vars() 11792 1727096160.72818: done with get_vars() 11792 1727096160.72859: in VariableManager get_vars() 11792 1727096160.72884: done with get_vars() 11792 1727096160.72933: in VariableManager get_vars() 11792 1727096160.72954: done with get_vars() 11792 1727096160.72998: in VariableManager get_vars() 11792 1727096160.73017: done with get_vars() 11792 1727096160.73450: in VariableManager get_vars() 11792 1727096160.73476: done with get_vars() 11792 1727096160.73489: done processing included file 11792 1727096160.73491: iterating over new_blocks loaded from include file 11792 1727096160.73493: in VariableManager get_vars() 11792 1727096160.73507: done with get_vars() 11792 1727096160.73509: filtering new block on tags 11792 1727096160.73621: done filtering new block on tags 11792 1727096160.73625: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 11792 1727096160.73630: extending task lists for all hosts with included blocks 11792 1727096160.73666: done extending task lists 11792 1727096160.73670: done processing included files 11792 1727096160.73671: results queue empty 11792 1727096160.73671: checking for any_errors_fatal 11792 1727096160.73674: done checking for any_errors_fatal 11792 1727096160.73679: checking for max_fail_percentage 11792 1727096160.73681: done checking for max_fail_percentage 11792 1727096160.73682: checking to see if all hosts have failed and the running result is not ok 11792 1727096160.73682: done checking to see if all hosts have failed 11792 1727096160.73683: getting the remaining hosts for this loop 11792 1727096160.73684: done getting the remaining hosts for this loop 11792 1727096160.73687: getting the next task for host managed_node2 11792 1727096160.73691: done getting next task for host managed_node2 11792 1727096160.73693: ^ task is: TASK: TEST: {{ lsr_description }} 11792 1727096160.73696: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096160.73698: getting variables 11792 1727096160.73699: in VariableManager get_vars() 11792 1727096160.73710: Calling all_inventory to load vars for managed_node2 11792 1727096160.73712: Calling groups_inventory to load vars for managed_node2 11792 1727096160.73715: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096160.73721: Calling all_plugins_play to load vars for managed_node2 11792 1727096160.73723: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096160.73727: Calling groups_plugins_play to load vars for managed_node2 11792 1727096160.75045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096160.76674: done with get_vars() 11792 1727096160.76707: done getting variables 11792 1727096160.76761: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096160.76886: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Monday 23 September 2024 08:56:00 -0400 (0:00:00.106) 0:00:43.048 ****** 11792 1727096160.76916: entering _queue_task() for managed_node2/debug 11792 1727096160.77314: worker is 1 (out of 1 available) 11792 1727096160.77326: exiting _queue_task() for managed_node2/debug 11792 1727096160.77340: done queuing things up, now waiting for results queue to drain 11792 1727096160.77341: waiting for pending results... 11792 1727096160.78087: running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 
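For orientation, the task behind this header (run_test.yml:5) appears to be a plain debug that prints the templated lsr_description between rows of hashes, judging by the MSG it emits further down in this trace. A minimal sketch, assuming only the lsr_description variable that the log itself shows:

    - name: "TEST: {{ lsr_description }}"
      debug:
        msg: "##########\n{{ lsr_description }}\n##########"
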
11792 1727096160.78094: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008ea 11792 1727096160.78097: variable 'ansible_search_path' from source: unknown 11792 1727096160.78100: variable 'ansible_search_path' from source: unknown 11792 1727096160.78103: calling self._execute() 11792 1727096160.78106: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.78109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.78112: variable 'omit' from source: magic vars 11792 1727096160.78458: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.78472: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.78480: variable 'omit' from source: magic vars 11792 1727096160.78518: variable 'omit' from source: magic vars 11792 1727096160.78734: variable 'lsr_description' from source: include params 11792 1727096160.78738: variable 'omit' from source: magic vars 11792 1727096160.78741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096160.78743: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.78760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096160.78781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.78801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.78832: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.78837: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.78840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.78955: Set connection var ansible_timeout to 10 11792 1727096160.78958: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.78971: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.78974: Set connection var ansible_pipelining to False 11792 1727096160.78976: Set connection var ansible_shell_type to sh 11792 1727096160.78979: Set connection var ansible_connection to ssh 11792 1727096160.79005: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.79008: variable 'ansible_connection' from source: unknown 11792 1727096160.79011: variable 'ansible_module_compression' from source: unknown 11792 1727096160.79014: variable 'ansible_shell_type' from source: unknown 11792 1727096160.79017: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.79019: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.79023: variable 'ansible_pipelining' from source: unknown 11792 1727096160.79027: variable 'ansible_timeout' from source: unknown 11792 1727096160.79030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.79186: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096160.79194: variable 'omit' from source: magic vars 11792 1727096160.79201: 
starting attempt loop 11792 1727096160.79204: running the handler 11792 1727096160.79295: handler run complete 11792 1727096160.79298: attempt loop complete, returning result 11792 1727096160.79300: _execute() done 11792 1727096160.79303: dumping result to json 11792 1727096160.79305: done dumping result, returning 11792 1727096160.79308: done running TaskExecutor() for managed_node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0afff68d-5257-d9c7-3fc0-0000000008ea] 11792 1727096160.79310: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ea 11792 1727096160.79439: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ea 11792 1727096160.79443: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 11792 1727096160.79497: no more pending results, returning what we have 11792 1727096160.79502: results queue empty 11792 1727096160.79671: checking for any_errors_fatal 11792 1727096160.79673: done checking for any_errors_fatal 11792 1727096160.79674: checking for max_fail_percentage 11792 1727096160.79676: done checking for max_fail_percentage 11792 1727096160.79677: checking to see if all hosts have failed and the running result is not ok 11792 1727096160.79677: done checking to see if all hosts have failed 11792 1727096160.79678: getting the remaining hosts for this loop 11792 1727096160.79680: done getting the remaining hosts for this loop 11792 1727096160.79683: getting the next task for host managed_node2 11792 1727096160.79689: done getting next task for host managed_node2 11792 1727096160.79692: ^ task is: TASK: Show item 11792 1727096160.79695: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096160.79699: getting variables 11792 1727096160.79700: in VariableManager get_vars() 11792 1727096160.79738: Calling all_inventory to load vars for managed_node2 11792 1727096160.79741: Calling groups_inventory to load vars for managed_node2 11792 1727096160.79743: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096160.79755: Calling all_plugins_play to load vars for managed_node2 11792 1727096160.79758: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096160.79762: Calling groups_plugins_play to load vars for managed_node2 11792 1727096160.81193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096160.82825: done with get_vars() 11792 1727096160.82857: done getting variables 11792 1727096160.82926: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Monday 23 September 2024 08:56:00 -0400 (0:00:00.060) 0:00:43.108 ****** 11792 1727096160.82964: entering _queue_task() for managed_node2/debug 11792 1727096160.83346: worker is 1 (out of 1 available) 11792 1727096160.83579: exiting _queue_task() for managed_node2/debug 11792 1727096160.83590: done queuing things up, now waiting for results queue to drain 11792 1727096160.83591: waiting for pending results... 11792 1727096160.83986: running TaskExecutor() for managed_node2/TASK: Show item 11792 1727096160.83991: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008eb 11792 1727096160.83994: variable 'ansible_search_path' from source: unknown 11792 1727096160.83996: variable 'ansible_search_path' from source: unknown 11792 1727096160.83999: variable 'omit' from source: magic vars 11792 1727096160.84007: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.84019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.84037: variable 'omit' from source: magic vars 11792 1727096160.84417: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.84428: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.84433: variable 'omit' from source: magic vars 11792 1727096160.84479: variable 'omit' from source: magic vars 11792 1727096160.84522: variable 'item' from source: unknown 11792 1727096160.84590: variable 'item' from source: unknown 11792 1727096160.84606: variable 'omit' from source: magic vars 11792 1727096160.84645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096160.84685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.84711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096160.84727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.84736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11792 1727096160.84766: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.84772: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.84775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.84946: Set connection var ansible_timeout to 10 11792 1727096160.84949: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.84956: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.84959: Set connection var ansible_pipelining to False 11792 1727096160.84962: Set connection var ansible_shell_type to sh 11792 1727096160.84964: Set connection var ansible_connection to ssh 11792 1727096160.84971: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.84975: variable 'ansible_connection' from source: unknown 11792 1727096160.84978: variable 'ansible_module_compression' from source: unknown 11792 1727096160.84980: variable 'ansible_shell_type' from source: unknown 11792 1727096160.84982: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.84984: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.84986: variable 'ansible_pipelining' from source: unknown 11792 1727096160.84988: variable 'ansible_timeout' from source: unknown 11792 1727096160.84991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.85203: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096160.85207: variable 'omit' from source: magic vars 11792 1727096160.85210: starting attempt loop 11792 1727096160.85212: running the handler 11792 1727096160.85215: variable 'lsr_description' from source: include params 11792 1727096160.85280: variable 'lsr_description' from source: include params 11792 1727096160.85290: handler run complete 11792 1727096160.85311: attempt loop complete, returning result 11792 1727096160.85324: variable 'item' from source: unknown 11792 1727096160.85402: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." 
} 11792 1727096160.85569: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.85574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.85577: variable 'omit' from source: magic vars 11792 1727096160.85875: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.85878: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.85881: variable 'omit' from source: magic vars 11792 1727096160.85883: variable 'omit' from source: magic vars 11792 1727096160.85885: variable 'item' from source: unknown 11792 1727096160.85887: variable 'item' from source: unknown 11792 1727096160.85890: variable 'omit' from source: magic vars 11792 1727096160.85892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.85894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.85897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.85911: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.85915: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.85918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.86002: Set connection var ansible_timeout to 10 11792 1727096160.86008: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.86022: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.86027: Set connection var ansible_pipelining to False 11792 1727096160.86030: Set connection var ansible_shell_type to sh 11792 1727096160.86032: Set connection var ansible_connection to ssh 11792 1727096160.86091: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.86095: variable 'ansible_connection' from source: unknown 11792 1727096160.86097: variable 'ansible_module_compression' from source: unknown 11792 1727096160.86099: variable 'ansible_shell_type' from source: unknown 11792 1727096160.86101: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.86103: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.86105: variable 'ansible_pipelining' from source: unknown 11792 1727096160.86107: variable 'ansible_timeout' from source: unknown 11792 1727096160.86109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.86177: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096160.86187: variable 'omit' from source: magic vars 11792 1727096160.86190: starting attempt loop 11792 1727096160.86283: running the handler 11792 1727096160.86289: variable 'lsr_setup' from source: include params 11792 1727096160.86303: variable 'lsr_setup' from source: include params 11792 1727096160.86352: handler run complete 11792 1727096160.86365: attempt loop complete, returning result 11792 1727096160.86419: variable 'item' from source: unknown 11792 1727096160.86450: variable 'item' from 
source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 11792 1727096160.86705: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.86710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.86713: variable 'omit' from source: magic vars 11792 1727096160.86716: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.86833: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.86836: variable 'omit' from source: magic vars 11792 1727096160.86839: variable 'omit' from source: magic vars 11792 1727096160.86841: variable 'item' from source: unknown 11792 1727096160.86856: variable 'item' from source: unknown 11792 1727096160.86875: variable 'omit' from source: magic vars 11792 1727096160.86895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.86905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.86912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.86926: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.86929: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.86931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.87017: Set connection var ansible_timeout to 10 11792 1727096160.87024: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.87034: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.87039: Set connection var ansible_pipelining to False 11792 1727096160.87046: Set connection var ansible_shell_type to sh 11792 1727096160.87050: Set connection var ansible_connection to ssh 11792 1727096160.87161: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.87165: variable 'ansible_connection' from source: unknown 11792 1727096160.87170: variable 'ansible_module_compression' from source: unknown 11792 1727096160.87173: variable 'ansible_shell_type' from source: unknown 11792 1727096160.87176: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.87178: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.87181: variable 'ansible_pipelining' from source: unknown 11792 1727096160.87184: variable 'ansible_timeout' from source: unknown 11792 1727096160.87186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.87206: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096160.87215: variable 'omit' from source: magic vars 11792 1727096160.87218: starting attempt loop 11792 1727096160.87220: running the handler 11792 1727096160.87242: variable 'lsr_test' from source: include params 11792 1727096160.87318: variable 'lsr_test' from source: include params 11792 
1727096160.87337: handler run complete 11792 1727096160.87349: attempt loop complete, returning result 11792 1727096160.87364: variable 'item' from source: unknown 11792 1727096160.87443: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile_reconfigure.yml" ] } 11792 1727096160.87556: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.87560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.87564: variable 'omit' from source: magic vars 11792 1727096160.87809: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.87812: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.87814: variable 'omit' from source: magic vars 11792 1727096160.87817: variable 'omit' from source: magic vars 11792 1727096160.87819: variable 'item' from source: unknown 11792 1727096160.87880: variable 'item' from source: unknown 11792 1727096160.87883: variable 'omit' from source: magic vars 11792 1727096160.87886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.87899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.87906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.88024: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.88028: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.88030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.88032: Set connection var ansible_timeout to 10 11792 1727096160.88034: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.88036: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.88038: Set connection var ansible_pipelining to False 11792 1727096160.88040: Set connection var ansible_shell_type to sh 11792 1727096160.88042: Set connection var ansible_connection to ssh 11792 1727096160.88057: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.88059: variable 'ansible_connection' from source: unknown 11792 1727096160.88062: variable 'ansible_module_compression' from source: unknown 11792 1727096160.88064: variable 'ansible_shell_type' from source: unknown 11792 1727096160.88066: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.88072: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.88075: variable 'ansible_pipelining' from source: unknown 11792 1727096160.88077: variable 'ansible_timeout' from source: unknown 11792 1727096160.88079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.88168: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096160.88177: variable 'omit' from source: magic vars 11792 1727096160.88180: starting attempt loop 11792 1727096160.88183: running 
the handler 11792 1727096160.88202: variable 'lsr_assert' from source: include params 11792 1727096160.88266: variable 'lsr_assert' from source: include params 11792 1727096160.88288: handler run complete 11792 1727096160.88300: attempt loop complete, returning result 11792 1727096160.88314: variable 'item' from source: unknown 11792 1727096160.88377: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_bond_options.yml" ] } 11792 1727096160.88584: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.88587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.88590: variable 'omit' from source: magic vars 11792 1727096160.88681: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.88685: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.88690: variable 'omit' from source: magic vars 11792 1727096160.88712: variable 'omit' from source: magic vars 11792 1727096160.88783: variable 'item' from source: unknown 11792 1727096160.88804: variable 'item' from source: unknown 11792 1727096160.88824: variable 'omit' from source: magic vars 11792 1727096160.88841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.88848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.88857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.88891: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.88894: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.88896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.88944: Set connection var ansible_timeout to 10 11792 1727096160.88950: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.88999: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.89003: Set connection var ansible_pipelining to False 11792 1727096160.89005: Set connection var ansible_shell_type to sh 11792 1727096160.89007: Set connection var ansible_connection to ssh 11792 1727096160.89009: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.89011: variable 'ansible_connection' from source: unknown 11792 1727096160.89013: variable 'ansible_module_compression' from source: unknown 11792 1727096160.89015: variable 'ansible_shell_type' from source: unknown 11792 1727096160.89017: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.89019: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.89020: variable 'ansible_pipelining' from source: unknown 11792 1727096160.89022: variable 'ansible_timeout' from source: unknown 11792 1727096160.89024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.89108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11792 1727096160.89112: variable 'omit' from source: magic vars 11792 1727096160.89117: starting attempt loop 11792 1727096160.89120: running the handler 11792 1727096160.89230: handler run complete 11792 1727096160.89249: attempt loop complete, returning result 11792 1727096160.89264: variable 'item' from source: unknown 11792 1727096160.89327: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 11792 1727096160.89483: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.89488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.89491: variable 'omit' from source: magic vars 11792 1727096160.89656: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.89660: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.89662: variable 'omit' from source: magic vars 11792 1727096160.89665: variable 'omit' from source: magic vars 11792 1727096160.89668: variable 'item' from source: unknown 11792 1727096160.89704: variable 'item' from source: unknown 11792 1727096160.89722: variable 'omit' from source: magic vars 11792 1727096160.89741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.89749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.89761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.89871: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.89875: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.89877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.89880: Set connection var ansible_timeout to 10 11792 1727096160.89882: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.89884: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.89886: Set connection var ansible_pipelining to False 11792 1727096160.89888: Set connection var ansible_shell_type to sh 11792 1727096160.89890: Set connection var ansible_connection to ssh 11792 1727096160.89908: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.89911: variable 'ansible_connection' from source: unknown 11792 1727096160.89913: variable 'ansible_module_compression' from source: unknown 11792 1727096160.89915: variable 'ansible_shell_type' from source: unknown 11792 1727096160.89917: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.89921: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.89926: variable 'ansible_pipelining' from source: unknown 11792 1727096160.89934: variable 'ansible_timeout' from source: unknown 11792 1727096160.89936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.90032: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096160.90041: variable 'omit' from source: magic vars 11792 1727096160.90044: starting attempt loop 11792 1727096160.90047: running the handler 11792 1727096160.90070: variable 'lsr_fail_debug' from source: play vars 11792 1727096160.90134: variable 'lsr_fail_debug' from source: play vars 11792 1727096160.90148: handler run complete 11792 1727096160.90161: attempt loop complete, returning result 11792 1727096160.90178: variable 'item' from source: unknown 11792 1727096160.90239: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 11792 1727096160.90331: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.90335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.90338: variable 'omit' from source: magic vars 11792 1727096160.90550: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.90556: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.90559: variable 'omit' from source: magic vars 11792 1727096160.90562: variable 'omit' from source: magic vars 11792 1727096160.90565: variable 'item' from source: unknown 11792 1727096160.90634: variable 'item' from source: unknown 11792 1727096160.90649: variable 'omit' from source: magic vars 11792 1727096160.90735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096160.90738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.90745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096160.90748: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096160.90750: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.90755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.90779: Set connection var ansible_timeout to 10 11792 1727096160.90787: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096160.90800: Set connection var ansible_shell_executable to /bin/sh 11792 1727096160.90806: Set connection var ansible_pipelining to False 11792 1727096160.90808: Set connection var ansible_shell_type to sh 11792 1727096160.90811: Set connection var ansible_connection to ssh 11792 1727096160.90835: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.90842: variable 'ansible_connection' from source: unknown 11792 1727096160.90845: variable 'ansible_module_compression' from source: unknown 11792 1727096160.90847: variable 'ansible_shell_type' from source: unknown 11792 1727096160.90849: variable 'ansible_shell_executable' from source: unknown 11792 1727096160.90851: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.90856: variable 'ansible_pipelining' from source: unknown 11792 1727096160.90858: variable 'ansible_timeout' from source: unknown 11792 1727096160.90860: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11792 1727096160.91045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096160.91048: variable 'omit' from source: magic vars 11792 1727096160.91050: starting attempt loop 11792 1727096160.91270: running the handler 11792 1727096160.91275: variable 'lsr_cleanup' from source: include params 11792 1727096160.91278: variable 'lsr_cleanup' from source: include params 11792 1727096160.91280: handler run complete 11792 1727096160.91282: attempt loop complete, returning result 11792 1727096160.91284: variable 'item' from source: unknown 11792 1727096160.91285: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml", "tasks/check_network_dns.yml" ] } 11792 1727096160.91350: dumping result to json 11792 1727096160.91355: done dumping result, returning 11792 1727096160.91357: done running TaskExecutor() for managed_node2/TASK: Show item [0afff68d-5257-d9c7-3fc0-0000000008eb] 11792 1727096160.91360: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008eb 11792 1727096160.91403: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008eb 11792 1727096160.91406: WORKER PROCESS EXITING 11792 1727096160.91512: no more pending results, returning what we have 11792 1727096160.91516: results queue empty 11792 1727096160.91516: checking for any_errors_fatal 11792 1727096160.91521: done checking for any_errors_fatal 11792 1727096160.91522: checking for max_fail_percentage 11792 1727096160.91523: done checking for max_fail_percentage 11792 1727096160.91524: checking to see if all hosts have failed and the running result is not ok 11792 1727096160.91525: done checking to see if all hosts have failed 11792 1727096160.91525: getting the remaining hosts for this loop 11792 1727096160.91527: done getting the remaining hosts for this loop 11792 1727096160.91530: getting the next task for host managed_node2 11792 1727096160.91536: done getting next task for host managed_node2 11792 1727096160.91538: ^ task is: TASK: Include the task 'show_interfaces.yml' 11792 1727096160.91541: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096160.91544: getting variables 11792 1727096160.91545: in VariableManager get_vars() 11792 1727096160.91583: Calling all_inventory to load vars for managed_node2 11792 1727096160.91586: Calling groups_inventory to load vars for managed_node2 11792 1727096160.91588: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096160.91599: Calling all_plugins_play to load vars for managed_node2 11792 1727096160.91602: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096160.91604: Calling groups_plugins_play to load vars for managed_node2 11792 1727096160.94191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096160.95898: done with get_vars() 11792 1727096160.95935: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Monday 23 September 2024 08:56:00 -0400 (0:00:00.130) 0:00:43.239 ****** 11792 1727096160.96047: entering _queue_task() for managed_node2/include_tasks 11792 1727096160.96698: worker is 1 (out of 1 available) 11792 1727096160.96711: exiting _queue_task() for managed_node2/include_tasks 11792 1727096160.96727: done queuing things up, now waiting for results queue to drain 11792 1727096160.96728: waiting for pending results... 11792 1727096160.97288: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 11792 1727096160.97294: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008ec 11792 1727096160.97298: variable 'ansible_search_path' from source: unknown 11792 1727096160.97300: variable 'ansible_search_path' from source: unknown 11792 1727096160.97303: calling self._execute() 11792 1727096160.97403: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096160.97417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096160.97441: variable 'omit' from source: magic vars 11792 1727096160.97877: variable 'ansible_distribution_major_version' from source: facts 11792 1727096160.97974: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096160.97984: _execute() done 11792 1727096160.97988: dumping result to json 11792 1727096160.97991: done dumping result, returning 11792 1727096160.97993: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-d9c7-3fc0-0000000008ec] 11792 1727096160.97995: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ec 11792 1727096160.98236: no more pending results, returning what we have 11792 1727096160.98243: in VariableManager get_vars() 11792 1727096160.98303: Calling all_inventory to load vars for managed_node2 11792 1727096160.98307: Calling groups_inventory to load vars for managed_node2 11792 1727096160.98309: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096160.98325: Calling all_plugins_play to load vars for managed_node2 11792 1727096160.98329: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096160.98332: Calling groups_plugins_play to load vars for managed_node2 11792 1727096160.98881: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ec 11792 1727096160.98884: WORKER PROCESS EXITING 11792 1727096161.00299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11792 1727096161.02178: done with get_vars() 11792 1727096161.02208: variable 'ansible_search_path' from source: unknown 11792 1727096161.02210: variable 'ansible_search_path' from source: unknown 11792 1727096161.02257: we have included files to process 11792 1727096161.02258: generating all_blocks data 11792 1727096161.02260: done generating all_blocks data 11792 1727096161.02270: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11792 1727096161.02271: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11792 1727096161.02275: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11792 1727096161.02391: in VariableManager get_vars() 11792 1727096161.02416: done with get_vars() 11792 1727096161.02537: done processing included file 11792 1727096161.02539: iterating over new_blocks loaded from include file 11792 1727096161.02541: in VariableManager get_vars() 11792 1727096161.02563: done with get_vars() 11792 1727096161.02565: filtering new block on tags 11792 1727096161.02603: done filtering new block on tags 11792 1727096161.02606: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 11792 1727096161.02611: extending task lists for all hosts with included blocks 11792 1727096161.03077: done extending task lists 11792 1727096161.03078: done processing included files 11792 1727096161.03079: results queue empty 11792 1727096161.03080: checking for any_errors_fatal 11792 1727096161.03087: done checking for any_errors_fatal 11792 1727096161.03088: checking for max_fail_percentage 11792 1727096161.03089: done checking for max_fail_percentage 11792 1727096161.03089: checking to see if all hosts have failed and the running result is not ok 11792 1727096161.03090: done checking to see if all hosts have failed 11792 1727096161.03091: getting the remaining hosts for this loop 11792 1727096161.03092: done getting the remaining hosts for this loop 11792 1727096161.03095: getting the next task for host managed_node2 11792 1727096161.03099: done getting next task for host managed_node2 11792 1727096161.03101: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 11792 1727096161.03104: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096161.03107: getting variables 11792 1727096161.03108: in VariableManager get_vars() 11792 1727096161.03121: Calling all_inventory to load vars for managed_node2 11792 1727096161.03123: Calling groups_inventory to load vars for managed_node2 11792 1727096161.03126: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.03131: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.03134: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.03136: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.04450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096161.06008: done with get_vars() 11792 1727096161.06041: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:56:01 -0400 (0:00:00.100) 0:00:43.340 ****** 11792 1727096161.06131: entering _queue_task() for managed_node2/include_tasks 11792 1727096161.06506: worker is 1 (out of 1 available) 11792 1727096161.06518: exiting _queue_task() for managed_node2/include_tasks 11792 1727096161.06531: done queuing things up, now waiting for results queue to drain 11792 1727096161.06533: waiting for pending results... 11792 1727096161.06834: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 11792 1727096161.06961: in run() - task 0afff68d-5257-d9c7-3fc0-000000000913 11792 1727096161.06985: variable 'ansible_search_path' from source: unknown 11792 1727096161.06997: variable 'ansible_search_path' from source: unknown 11792 1727096161.07173: calling self._execute() 11792 1727096161.07176: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.07179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.07182: variable 'omit' from source: magic vars 11792 1727096161.07561: variable 'ansible_distribution_major_version' from source: facts 11792 1727096161.07583: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096161.07595: _execute() done 11792 1727096161.07604: dumping result to json 11792 1727096161.07613: done dumping result, returning 11792 1727096161.07630: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-d9c7-3fc0-000000000913] 11792 1727096161.07640: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000913 11792 1727096161.07906: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000913 11792 1727096161.07910: WORKER PROCESS EXITING 11792 1727096161.07942: no more pending results, returning what we have 11792 1727096161.07947: in VariableManager get_vars() 11792 1727096161.08004: Calling all_inventory to load vars for managed_node2 11792 1727096161.08008: Calling groups_inventory to load vars for managed_node2 11792 1727096161.08010: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.08026: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.08030: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.08034: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.09638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 11792 1727096161.11373: done with get_vars() 11792 1727096161.11398: variable 'ansible_search_path' from source: unknown 11792 1727096161.11400: variable 'ansible_search_path' from source: unknown 11792 1727096161.11441: we have included files to process 11792 1727096161.11442: generating all_blocks data 11792 1727096161.11444: done generating all_blocks data 11792 1727096161.11446: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11792 1727096161.11447: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11792 1727096161.11449: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11792 1727096161.11732: done processing included file 11792 1727096161.11734: iterating over new_blocks loaded from include file 11792 1727096161.11736: in VariableManager get_vars() 11792 1727096161.11760: done with get_vars() 11792 1727096161.11762: filtering new block on tags 11792 1727096161.11802: done filtering new block on tags 11792 1727096161.11805: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 11792 1727096161.11811: extending task lists for all hosts with included blocks 11792 1727096161.11978: done extending task lists 11792 1727096161.11980: done processing included files 11792 1727096161.11981: results queue empty 11792 1727096161.11982: checking for any_errors_fatal 11792 1727096161.11985: done checking for any_errors_fatal 11792 1727096161.11986: checking for max_fail_percentage 11792 1727096161.11987: done checking for max_fail_percentage 11792 1727096161.11988: checking to see if all hosts have failed and the running result is not ok 11792 1727096161.11988: done checking to see if all hosts have failed 11792 1727096161.11989: getting the remaining hosts for this loop 11792 1727096161.11990: done getting the remaining hosts for this loop 11792 1727096161.11993: getting the next task for host managed_node2 11792 1727096161.11997: done getting next task for host managed_node2 11792 1727096161.12000: ^ task is: TASK: Gather current interface info 11792 1727096161.12003: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11792 1727096161.12005: getting variables 11792 1727096161.12006: in VariableManager get_vars() 11792 1727096161.12017: Calling all_inventory to load vars for managed_node2 11792 1727096161.12020: Calling groups_inventory to load vars for managed_node2 11792 1727096161.12022: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.12027: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.12030: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.12033: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.13217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096161.14809: done with get_vars() 11792 1727096161.14844: done getting variables 11792 1727096161.14897: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:56:01 -0400 (0:00:00.088) 0:00:43.428 ****** 11792 1727096161.14934: entering _queue_task() for managed_node2/command 11792 1727096161.15325: worker is 1 (out of 1 available) 11792 1727096161.15339: exiting _queue_task() for managed_node2/command 11792 1727096161.15356: done queuing things up, now waiting for results queue to drain 11792 1727096161.15358: waiting for pending results... 
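The preceding entries walk through the rest of run_test.yml: a 'Show item' debug that loops over the lsr_* control variables, then an include of show_interfaces.yml, which in turn includes get_current_interfaces.yml. The loop items below are taken directly from the results printed above; the file contents themselves are not shown in the trace, so treat this as a sketch rather than a verbatim copy:

    - name: Show item
      debug:
        var: "{{ item }}"
      loop:
        - lsr_description
        - lsr_setup
        - lsr_test
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup

    - name: Include the task 'show_interfaces.yml'
      include_tasks: show_interfaces.yml   # resolved relative to playbooks/tasks/, per the task path logged above

Note that debug with var reports "VARIABLE IS NOT DEFINED!" instead of failing, which is why the undefined lsr_assert_when item above still comes back ok.
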
11792 1727096161.15595: running TaskExecutor() for managed_node2/TASK: Gather current interface info 11792 1727096161.15830: in run() - task 0afff68d-5257-d9c7-3fc0-00000000094e 11792 1727096161.15835: variable 'ansible_search_path' from source: unknown 11792 1727096161.15838: variable 'ansible_search_path' from source: unknown 11792 1727096161.15841: calling self._execute() 11792 1727096161.15920: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.15942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.15957: variable 'omit' from source: magic vars 11792 1727096161.16347: variable 'ansible_distribution_major_version' from source: facts 11792 1727096161.16372: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096161.16383: variable 'omit' from source: magic vars 11792 1727096161.16433: variable 'omit' from source: magic vars 11792 1727096161.16474: variable 'omit' from source: magic vars 11792 1727096161.16521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096161.16559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096161.16594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096161.16615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096161.16627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096161.16672: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096161.16676: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.16678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.16802: Set connection var ansible_timeout to 10 11792 1727096161.16806: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096161.16808: Set connection var ansible_shell_executable to /bin/sh 11792 1727096161.16810: Set connection var ansible_pipelining to False 11792 1727096161.16815: Set connection var ansible_shell_type to sh 11792 1727096161.16822: Set connection var ansible_connection to ssh 11792 1727096161.16872: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.16876: variable 'ansible_connection' from source: unknown 11792 1727096161.16879: variable 'ansible_module_compression' from source: unknown 11792 1727096161.16882: variable 'ansible_shell_type' from source: unknown 11792 1727096161.16885: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.16887: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.16890: variable 'ansible_pipelining' from source: unknown 11792 1727096161.16893: variable 'ansible_timeout' from source: unknown 11792 1727096161.16896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.17073: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096161.17077: variable 'omit' from source: magic vars 11792 
1727096161.17127: starting attempt loop 11792 1727096161.17130: running the handler 11792 1727096161.17133: _low_level_execute_command(): starting 11792 1727096161.17135: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096161.17903: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096161.18008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.18355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.18398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.20163: stdout chunk (state=3): >>>/root <<< 11792 1727096161.20294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.20320: stdout chunk (state=3): >>><<< 11792 1727096161.20341: stderr chunk (state=3): >>><<< 11792 1727096161.20466: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096161.20474: _low_level_execute_command(): starting 11792 1727096161.20478: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573 `" && echo ansible-tmp-1727096161.2036562-13873-106236420854573="` echo /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573 `" ) && 
sleep 0' 11792 1727096161.21064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096161.21081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096161.21095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096161.21111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096161.21149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096161.21172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096161.21255: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.21278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.21344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.23410: stdout chunk (state=3): >>>ansible-tmp-1727096161.2036562-13873-106236420854573=/root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573 <<< 11792 1727096161.23595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.23599: stdout chunk (state=3): >>><<< 11792 1727096161.23602: stderr chunk (state=3): >>><<< 11792 1727096161.23775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096161.2036562-13873-106236420854573=/root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096161.23779: variable 'ansible_module_compression' from source: unknown 11792 1727096161.23781: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096161.23784: variable 'ansible_facts' from source: unknown 11792 1727096161.23853: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/AnsiballZ_command.py 11792 1727096161.24033: Sending initial data 11792 1727096161.24042: Sent initial data (156 bytes) 11792 1727096161.24780: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096161.24784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.24835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.24873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.26669: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096161.26679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/AnsiballZ_command.py" <<< 11792 1727096161.26932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmptyj14bd1 /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/AnsiballZ_command.py <<< 11792 1727096161.26937: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmptyj14bd1" to remote "/root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/AnsiballZ_command.py" <<< 11792 1727096161.27391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.27473: stderr chunk (state=3): >>><<< 11792 1727096161.27494: stdout chunk (state=3): >>><<< 11792 1727096161.27558: done transferring module to remote 11792 1727096161.27577: _low_level_execute_command(): starting 11792 1727096161.27588: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/ /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/AnsiballZ_command.py && sleep 0' 11792 1727096161.28253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096161.28272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096161.28289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096161.28307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096161.28338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.28435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.28460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.28532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.30531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.30536: stdout chunk (state=3): >>><<< 11792 1727096161.30653: stderr chunk (state=3): >>><<< 11792 1727096161.30657: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096161.30660: _low_level_execute_command(): starting 11792 1727096161.30663: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/AnsiballZ_command.py && sleep 0' 11792 1727096161.31212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096161.31230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096161.31247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096161.31266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096161.31376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.31405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.31474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.47960: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:56:01.474553", "end": "2024-09-23 08:56:01.478247", "delta": "0:00:00.003694", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096161.49689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096161.49734: stderr chunk (state=3): >>><<< 11792 1727096161.49744: stdout chunk (state=3): >>><<< 11792 1727096161.49780: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:56:01.474553", "end": "2024-09-23 08:56:01.478247", "delta": "0:00:00.003694", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
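The exchange above shows the standard non-pipelined execution path: create a remote temp directory, transfer AnsiballZ_command.py over SFTP, chmod it, run it with /usr/bin/python3.12, and (below) remove the temp directory again. The connection vars logged earlier show ansible_pipelining set to False, which is why the file transfer happens at all. As a hedged configuration sketch (an assumption about setup, not something taken from this run), setting the following variable in inventory or group_vars would let ansible-core pipe the module payload straight to the remote interpreter and skip those round trips:

    # sketch: enabling pipelining avoids the temp-dir/SFTP steps seen above
    ansible_pipelining: true
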
11792 1727096161.49835: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096161.49849: _low_level_execute_command(): starting 11792 1727096161.49858: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096161.2036562-13873-106236420854573/ > /dev/null 2>&1 && sleep 0' 11792 1727096161.50747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096161.50805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.50875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096161.50899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.50933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.51001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.53077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.53081: stdout chunk (state=3): >>><<< 11792 1727096161.53084: stderr chunk (state=3): >>><<< 11792 1727096161.53087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096161.53089: handler run complete 11792 1727096161.53092: Evaluated conditional (False): False 11792 1727096161.53094: attempt loop complete, returning result 11792 1727096161.53095: _execute() done 11792 1727096161.53097: dumping result to json 11792 1727096161.53099: done dumping result, returning 11792 1727096161.53100: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-d9c7-3fc0-00000000094e] 11792 1727096161.53110: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000094e 11792 1727096161.53376: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000094e 11792 1727096161.53380: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003694", "end": "2024-09-23 08:56:01.478247", "rc": 0, "start": "2024-09-23 08:56:01.474553" } STDOUT: bonding_masters eth0 lo 11792 1727096161.53456: no more pending results, returning what we have 11792 1727096161.53460: results queue empty 11792 1727096161.53460: checking for any_errors_fatal 11792 1727096161.53462: done checking for any_errors_fatal 11792 1727096161.53463: checking for max_fail_percentage 11792 1727096161.53465: done checking for max_fail_percentage 11792 1727096161.53465: checking to see if all hosts have failed and the running result is not ok 11792 1727096161.53466: done checking to see if all hosts have failed 11792 1727096161.53479: getting the remaining hosts for this loop 11792 1727096161.53481: done getting the remaining hosts for this loop 11792 1727096161.53485: getting the next task for host managed_node2 11792 1727096161.53492: done getting next task for host managed_node2 11792 1727096161.53494: ^ task is: TASK: Set current_interfaces 11792 1727096161.53501: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096161.53506: getting variables 11792 1727096161.53507: in VariableManager get_vars() 11792 1727096161.53547: Calling all_inventory to load vars for managed_node2 11792 1727096161.53549: Calling groups_inventory to load vars for managed_node2 11792 1727096161.53551: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.53564: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.53566: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.53675: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.55187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096161.56716: done with get_vars() 11792 1727096161.56752: done getting variables 11792 1727096161.56822: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:56:01 -0400 (0:00:00.419) 0:00:43.847 ****** 11792 1727096161.56859: entering _queue_task() for managed_node2/set_fact 11792 1727096161.57246: worker is 1 (out of 1 available) 11792 1727096161.57259: exiting _queue_task() for managed_node2/set_fact 11792 1727096161.57280: done queuing things up, now waiting for results queue to drain 11792 1727096161.57282: waiting for pending results... 
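For orientation, the module arguments and results above make it possible to sketch what the two tasks in get_current_interfaces.yml (the command that just ran and the set_fact queued next) most likely look like. This is a reconstruction from the logged behaviour, not a quote of the file; the variable name comes from the log, while the changed_when handling and the stdout_lines expression are assumptions:

    # sketch reconstructed from the log, not the actual task file
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # variable name seen in the log; how it is populated is an assumption
      changed_when: false             # inferred: the displayed result reports changed=false

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
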
11792 1727096161.57558: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 11792 1727096161.57716: in run() - task 0afff68d-5257-d9c7-3fc0-00000000094f 11792 1727096161.57741: variable 'ansible_search_path' from source: unknown 11792 1727096161.57750: variable 'ansible_search_path' from source: unknown 11792 1727096161.57806: calling self._execute() 11792 1727096161.58005: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.58009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.58012: variable 'omit' from source: magic vars 11792 1727096161.58359: variable 'ansible_distribution_major_version' from source: facts 11792 1727096161.58381: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096161.58394: variable 'omit' from source: magic vars 11792 1727096161.58463: variable 'omit' from source: magic vars 11792 1727096161.58587: variable '_current_interfaces' from source: set_fact 11792 1727096161.58672: variable 'omit' from source: magic vars 11792 1727096161.58717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096161.58761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096161.58796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096161.58820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096161.58837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096161.58983: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096161.58987: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.58990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.59016: Set connection var ansible_timeout to 10 11792 1727096161.59029: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096161.59041: Set connection var ansible_shell_executable to /bin/sh 11792 1727096161.59049: Set connection var ansible_pipelining to False 11792 1727096161.59057: Set connection var ansible_shell_type to sh 11792 1727096161.59063: Set connection var ansible_connection to ssh 11792 1727096161.59094: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.59104: variable 'ansible_connection' from source: unknown 11792 1727096161.59111: variable 'ansible_module_compression' from source: unknown 11792 1727096161.59117: variable 'ansible_shell_type' from source: unknown 11792 1727096161.59122: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.59127: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.59133: variable 'ansible_pipelining' from source: unknown 11792 1727096161.59138: variable 'ansible_timeout' from source: unknown 11792 1727096161.59199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.59293: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11792 1727096161.59313: variable 'omit' from source: magic vars 11792 1727096161.59326: starting attempt loop 11792 1727096161.59332: running the handler 11792 1727096161.59345: handler run complete 11792 1727096161.59361: attempt loop complete, returning result 11792 1727096161.59367: _execute() done 11792 1727096161.59374: dumping result to json 11792 1727096161.59380: done dumping result, returning 11792 1727096161.59389: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-d9c7-3fc0-00000000094f] 11792 1727096161.59396: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000094f ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 11792 1727096161.59614: no more pending results, returning what we have 11792 1727096161.59617: results queue empty 11792 1727096161.59618: checking for any_errors_fatal 11792 1727096161.59629: done checking for any_errors_fatal 11792 1727096161.59630: checking for max_fail_percentage 11792 1727096161.59632: done checking for max_fail_percentage 11792 1727096161.59633: checking to see if all hosts have failed and the running result is not ok 11792 1727096161.59634: done checking to see if all hosts have failed 11792 1727096161.59634: getting the remaining hosts for this loop 11792 1727096161.59636: done getting the remaining hosts for this loop 11792 1727096161.59639: getting the next task for host managed_node2 11792 1727096161.59648: done getting next task for host managed_node2 11792 1727096161.59650: ^ task is: TASK: Show current_interfaces 11792 1727096161.59656: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096161.59660: getting variables 11792 1727096161.59661: in VariableManager get_vars() 11792 1727096161.59799: Calling all_inventory to load vars for managed_node2 11792 1727096161.59802: Calling groups_inventory to load vars for managed_node2 11792 1727096161.59804: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.59815: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.59817: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.59820: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.60340: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000094f 11792 1727096161.60344: WORKER PROCESS EXITING 11792 1727096161.61505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096161.63211: done with get_vars() 11792 1727096161.63256: done getting variables 11792 1727096161.63322: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:56:01 -0400 (0:00:00.065) 0:00:43.912 ****** 11792 1727096161.63364: entering _queue_task() for managed_node2/debug 11792 1727096161.63764: worker is 1 (out of 1 available) 11792 1727096161.63786: exiting _queue_task() for managed_node2/debug 11792 1727096161.63801: done queuing things up, now waiting for results queue to drain 11792 1727096161.63803: waiting for pending results... 
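Judging from the message this task prints further down (current_interfaces: ['bonding_masters', 'eth0', 'lo']), the debug task at show_interfaces.yml:5 plausibly amounts to no more than the following sketch (inferred, not quoted from the file):

    # sketch inferred from the printed message
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"
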
11792 1727096161.64097: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 11792 1727096161.64202: in run() - task 0afff68d-5257-d9c7-3fc0-000000000914 11792 1727096161.64219: variable 'ansible_search_path' from source: unknown 11792 1727096161.64223: variable 'ansible_search_path' from source: unknown 11792 1727096161.64303: calling self._execute() 11792 1727096161.64363: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.64371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.64382: variable 'omit' from source: magic vars 11792 1727096161.64766: variable 'ansible_distribution_major_version' from source: facts 11792 1727096161.64872: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096161.64876: variable 'omit' from source: magic vars 11792 1727096161.64879: variable 'omit' from source: magic vars 11792 1727096161.64925: variable 'current_interfaces' from source: set_fact 11792 1727096161.64955: variable 'omit' from source: magic vars 11792 1727096161.64996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096161.65033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096161.65051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096161.65071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096161.65082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096161.65113: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096161.65116: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.65119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.65216: Set connection var ansible_timeout to 10 11792 1727096161.65224: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096161.65233: Set connection var ansible_shell_executable to /bin/sh 11792 1727096161.65238: Set connection var ansible_pipelining to False 11792 1727096161.65241: Set connection var ansible_shell_type to sh 11792 1727096161.65243: Set connection var ansible_connection to ssh 11792 1727096161.65300: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.65303: variable 'ansible_connection' from source: unknown 11792 1727096161.65306: variable 'ansible_module_compression' from source: unknown 11792 1727096161.65308: variable 'ansible_shell_type' from source: unknown 11792 1727096161.65312: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.65317: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.65320: variable 'ansible_pipelining' from source: unknown 11792 1727096161.65322: variable 'ansible_timeout' from source: unknown 11792 1727096161.65323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.65428: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
11792 1727096161.65517: variable 'omit' from source: magic vars 11792 1727096161.65520: starting attempt loop 11792 1727096161.65523: running the handler 11792 1727096161.65525: handler run complete 11792 1727096161.65527: attempt loop complete, returning result 11792 1727096161.65531: _execute() done 11792 1727096161.65533: dumping result to json 11792 1727096161.65536: done dumping result, returning 11792 1727096161.65538: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-d9c7-3fc0-000000000914] 11792 1727096161.65540: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000914 11792 1727096161.65609: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000914 11792 1727096161.65612: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 11792 1727096161.65663: no more pending results, returning what we have 11792 1727096161.65668: results queue empty 11792 1727096161.65669: checking for any_errors_fatal 11792 1727096161.65675: done checking for any_errors_fatal 11792 1727096161.65675: checking for max_fail_percentage 11792 1727096161.65677: done checking for max_fail_percentage 11792 1727096161.65677: checking to see if all hosts have failed and the running result is not ok 11792 1727096161.65678: done checking to see if all hosts have failed 11792 1727096161.65679: getting the remaining hosts for this loop 11792 1727096161.65680: done getting the remaining hosts for this loop 11792 1727096161.65684: getting the next task for host managed_node2 11792 1727096161.65692: done getting next task for host managed_node2 11792 1727096161.65695: ^ task is: TASK: Setup 11792 1727096161.65698: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096161.65704: getting variables 11792 1727096161.65705: in VariableManager get_vars() 11792 1727096161.65744: Calling all_inventory to load vars for managed_node2 11792 1727096161.65747: Calling groups_inventory to load vars for managed_node2 11792 1727096161.65749: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.65763: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.65766: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.65875: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.67384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096161.69075: done with get_vars() 11792 1727096161.69111: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Monday 23 September 2024 08:56:01 -0400 (0:00:00.058) 0:00:43.971 ****** 11792 1727096161.69223: entering _queue_task() for managed_node2/include_tasks 11792 1727096161.69635: worker is 1 (out of 1 available) 11792 1727096161.69649: exiting _queue_task() for managed_node2/include_tasks 11792 1727096161.69666: done queuing things up, now waiting for results queue to drain 11792 1727096161.69670: waiting for pending results... 11792 1727096161.70093: running TaskExecutor() for managed_node2/TASK: Setup 11792 1727096161.70187: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008ed 11792 1727096161.70192: variable 'ansible_search_path' from source: unknown 11792 1727096161.70195: variable 'ansible_search_path' from source: unknown 11792 1727096161.70213: variable 'lsr_setup' from source: include params 11792 1727096161.70443: variable 'lsr_setup' from source: include params 11792 1727096161.70536: variable 'omit' from source: magic vars 11792 1727096161.70705: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.70728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.70839: variable 'omit' from source: magic vars 11792 1727096161.71016: variable 'ansible_distribution_major_version' from source: facts 11792 1727096161.71031: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096161.71041: variable 'item' from source: unknown 11792 1727096161.71122: variable 'item' from source: unknown 11792 1727096161.71166: variable 'item' from source: unknown 11792 1727096161.71232: variable 'item' from source: unknown 11792 1727096161.71574: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.71578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.71580: variable 'omit' from source: magic vars 11792 1727096161.71699: variable 'ansible_distribution_major_version' from source: facts 11792 1727096161.71702: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096161.71704: variable 'item' from source: unknown 11792 1727096161.71706: variable 'item' from source: unknown 11792 1727096161.71732: variable 'item' from source: unknown 11792 1727096161.71808: variable 'item' from source: unknown 11792 1727096161.72026: dumping result to json 11792 1727096161.72030: done dumping result, returning 11792 1727096161.72033: done running TaskExecutor() for managed_node2/TASK: Setup 
[0afff68d-5257-d9c7-3fc0-0000000008ed] 11792 1727096161.72035: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ed 11792 1727096161.72081: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ed 11792 1727096161.72084: WORKER PROCESS EXITING 11792 1727096161.72158: no more pending results, returning what we have 11792 1727096161.72164: in VariableManager get_vars() 11792 1727096161.72214: Calling all_inventory to load vars for managed_node2 11792 1727096161.72217: Calling groups_inventory to load vars for managed_node2 11792 1727096161.72220: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.72236: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.72240: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.72244: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.73863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096161.75785: done with get_vars() 11792 1727096161.75810: variable 'ansible_search_path' from source: unknown 11792 1727096161.75812: variable 'ansible_search_path' from source: unknown 11792 1727096161.75866: variable 'ansible_search_path' from source: unknown 11792 1727096161.75871: variable 'ansible_search_path' from source: unknown 11792 1727096161.75903: we have included files to process 11792 1727096161.75905: generating all_blocks data 11792 1727096161.75907: done generating all_blocks data 11792 1727096161.75912: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11792 1727096161.75913: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11792 1727096161.75916: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11792 1727096161.76890: done processing included file 11792 1727096161.76893: iterating over new_blocks loaded from include file 11792 1727096161.76895: in VariableManager get_vars() 11792 1727096161.76917: done with get_vars() 11792 1727096161.76919: filtering new block on tags 11792 1727096161.76985: done filtering new block on tags 11792 1727096161.76988: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/create_test_interfaces_with_dhcp.yml) 11792 1727096161.76994: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11792 1727096161.76995: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11792 1727096161.76999: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11792 1727096161.77100: in VariableManager get_vars() 11792 1727096161.77123: done with get_vars() 11792 1727096161.77129: variable 'item' from source: include params 11792 1727096161.77243: variable 'item' from source: include params 11792 1727096161.77286: Loading data from 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11792 1727096161.77372: in VariableManager get_vars() 11792 1727096161.77396: done with get_vars() 11792 1727096161.77542: in VariableManager get_vars() 11792 1727096161.77566: done with get_vars() 11792 1727096161.77575: variable 'item' from source: include params 11792 1727096161.77650: variable 'item' from source: include params 11792 1727096161.77685: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11792 1727096161.77871: in VariableManager get_vars() 11792 1727096161.77894: done with get_vars() 11792 1727096161.78008: done processing included file 11792 1727096161.78011: iterating over new_blocks loaded from include file 11792 1727096161.78012: in VariableManager get_vars() 11792 1727096161.78033: done with get_vars() 11792 1727096161.78035: filtering new block on tags 11792 1727096161.78114: done filtering new block on tags 11792 1727096161.78118: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node2 => (item=tasks/assert_dhcp_device_present.yml) 11792 1727096161.78122: extending task lists for all hosts with included blocks 11792 1727096161.78696: done extending task lists 11792 1727096161.78698: done processing included files 11792 1727096161.78699: results queue empty 11792 1727096161.78700: checking for any_errors_fatal 11792 1727096161.78703: done checking for any_errors_fatal 11792 1727096161.78704: checking for max_fail_percentage 11792 1727096161.78705: done checking for max_fail_percentage 11792 1727096161.78706: checking to see if all hosts have failed and the running result is not ok 11792 1727096161.78707: done checking to see if all hosts have failed 11792 1727096161.78714: getting the remaining hosts for this loop 11792 1727096161.78715: done getting the remaining hosts for this loop 11792 1727096161.78718: getting the next task for host managed_node2 11792 1727096161.78722: done getting next task for host managed_node2 11792 1727096161.78724: ^ task is: TASK: Install dnsmasq 11792 1727096161.78727: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096161.78730: getting variables 11792 1727096161.78731: in VariableManager get_vars() 11792 1727096161.78745: Calling all_inventory to load vars for managed_node2 11792 1727096161.78747: Calling groups_inventory to load vars for managed_node2 11792 1727096161.78749: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096161.78759: Calling all_plugins_play to load vars for managed_node2 11792 1727096161.78761: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096161.78764: Calling groups_plugins_play to load vars for managed_node2 11792 1727096161.80040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096161.81744: done with get_vars() 11792 1727096161.81781: done getting variables 11792 1727096161.81824: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:56:01 -0400 (0:00:00.126) 0:00:44.097 ****** 11792 1727096161.81862: entering _queue_task() for managed_node2/package 11792 1727096161.82238: worker is 1 (out of 1 available) 11792 1727096161.82251: exiting _queue_task() for managed_node2/package 11792 1727096161.82272: done queuing things up, now waiting for results queue to drain 11792 1727096161.82274: waiting for pending results... 
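The action plugin loaded here is package and the cached AnsiballZ payload selected below is ansible.modules.dnf, so the task at create_test_interfaces_with_dhcp.yml:3 is almost certainly a plain package install. A hedged sketch follows; only the module and package name are evident from the log, and the state argument is an assumption:

    # sketch; exact arguments are not visible in this part of the log
    - name: Install dnsmasq
      package:
        name: dnsmasq
        state: present   # assumed
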
11792 1727096161.82487: running TaskExecutor() for managed_node2/TASK: Install dnsmasq 11792 1727096161.82675: in run() - task 0afff68d-5257-d9c7-3fc0-000000000974 11792 1727096161.82679: variable 'ansible_search_path' from source: unknown 11792 1727096161.82682: variable 'ansible_search_path' from source: unknown 11792 1727096161.82686: calling self._execute() 11792 1727096161.82758: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.82763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.82805: variable 'omit' from source: magic vars 11792 1727096161.83162: variable 'ansible_distribution_major_version' from source: facts 11792 1727096161.83181: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096161.83188: variable 'omit' from source: magic vars 11792 1727096161.83240: variable 'omit' from source: magic vars 11792 1727096161.83461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096161.86080: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096161.86174: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096161.86266: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096161.86276: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096161.86308: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096161.86423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096161.86460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096161.86503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096161.86588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096161.86593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096161.86873: variable '__network_is_ostree' from source: set_fact 11792 1727096161.86877: variable 'omit' from source: magic vars 11792 1727096161.86879: variable 'omit' from source: magic vars 11792 1727096161.86882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096161.86885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096161.86887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096161.86890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11792 1727096161.86892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096161.86927: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096161.86937: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.86945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.87070: Set connection var ansible_timeout to 10 11792 1727096161.87086: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096161.87101: Set connection var ansible_shell_executable to /bin/sh 11792 1727096161.87121: Set connection var ansible_pipelining to False 11792 1727096161.87229: Set connection var ansible_shell_type to sh 11792 1727096161.87232: Set connection var ansible_connection to ssh 11792 1727096161.87234: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.87236: variable 'ansible_connection' from source: unknown 11792 1727096161.87238: variable 'ansible_module_compression' from source: unknown 11792 1727096161.87239: variable 'ansible_shell_type' from source: unknown 11792 1727096161.87241: variable 'ansible_shell_executable' from source: unknown 11792 1727096161.87243: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096161.87244: variable 'ansible_pipelining' from source: unknown 11792 1727096161.87246: variable 'ansible_timeout' from source: unknown 11792 1727096161.87248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096161.87335: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096161.87354: variable 'omit' from source: magic vars 11792 1727096161.87364: starting attempt loop 11792 1727096161.87372: running the handler 11792 1727096161.87446: variable 'ansible_facts' from source: unknown 11792 1727096161.87449: variable 'ansible_facts' from source: unknown 11792 1727096161.87451: _low_level_execute_command(): starting 11792 1727096161.87456: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096161.88247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096161.88321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.88363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.88383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.88449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.90232: stdout chunk (state=3): >>>/root <<< 11792 1727096161.90404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.90408: stdout chunk (state=3): >>><<< 11792 1727096161.90410: stderr chunk (state=3): >>><<< 11792 1727096161.90431: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096161.90546: _low_level_execute_command(): starting 11792 1727096161.90550: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932 `" && echo ansible-tmp-1727096161.9044545-13899-61680814534932="` echo /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932 `" ) && sleep 0' 11792 1727096161.91110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.91115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096161.91218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.91222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096161.91225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 11792 1727096161.91228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.91301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.93402: stdout chunk (state=3): >>>ansible-tmp-1727096161.9044545-13899-61680814534932=/root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932 <<< 11792 1727096161.93566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.93572: stdout chunk (state=3): >>><<< 11792 1727096161.93574: stderr chunk (state=3): >>><<< 11792 1727096161.93674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096161.9044545-13899-61680814534932=/root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096161.93678: variable 'ansible_module_compression' from source: unknown 11792 1727096161.93687: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11792 1727096161.93737: variable 'ansible_facts' from source: unknown 11792 1727096161.93873: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/AnsiballZ_dnf.py 11792 1727096161.94030: Sending initial data 11792 1727096161.94084: Sent initial data (151 bytes) 11792 1727096161.94799: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.94838: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096161.94869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.94885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.94961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096161.96708: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096161.96757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096161.96801: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmplozgnqcw /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/AnsiballZ_dnf.py <<< 11792 1727096161.96805: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/AnsiballZ_dnf.py" <<< 11792 1727096161.96865: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmplozgnqcw" to remote "/root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/AnsiballZ_dnf.py" <<< 11792 1727096161.97961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096161.98101: stderr chunk (state=3): >>><<< 11792 1727096161.98105: stdout chunk (state=3): >>><<< 11792 1727096161.98107: done transferring module to remote 11792 1727096161.98109: _low_level_execute_command(): starting 11792 1727096161.98112: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/ /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/AnsiballZ_dnf.py && sleep 0' 11792 1727096161.98679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096161.98737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.98746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096161.98833: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096161.98853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096161.98876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096161.98903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096161.98987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096162.01088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096162.01328: stdout chunk (state=3): >>><<< 11792 1727096162.01333: stderr chunk (state=3): >>><<< 11792 1727096162.01336: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096162.01339: _low_level_execute_command(): starting 11792 1727096162.01341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/AnsiballZ_dnf.py && sleep 0' 11792 1727096162.02581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096162.02585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096162.02587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096162.02589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096162.02934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096162.47485: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11792 1727096162.52317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096162.52473: stderr chunk (state=3): >>><<< 11792 1727096162.52477: stdout chunk (state=3): >>><<< 11792 1727096162.52479: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
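The JSON payload in the record above is the dnf module's result for the "Install dnsmasq" task: rc=0, changed=false and msg "Nothing to do" mean dnsmasq was already present on managed_node2, and the module_args block shows every option at its default except name=["dnsmasq"] and state="present". The action plugin loaded was 'package', which resolved to ansible.legacy.dnf on this host. Based on those arguments and on the "__install_status is success" until-condition evaluated a few records further down, the task in the test playbook most likely looks roughly like the sketch below; the retries and delay values are assumptions, since this log only records "attempts": 1.

- name: Install dnsmasq
  ansible.builtin.package:
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success
  retries: 6    # assumed; the log only shows "attempts": 1
  delay: 10     # assumed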
11792 1727096162.52483: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096162.52490: _low_level_execute_command(): starting 11792 1727096162.52492: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096161.9044545-13899-61680814534932/ > /dev/null 2>&1 && sleep 0' 11792 1727096162.53820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096162.54186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096162.54208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096162.54297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096162.56292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096162.56304: stdout chunk (state=3): >>><<< 11792 1727096162.56316: stderr chunk (state=3): >>><<< 11792 1727096162.56337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096162.56350: handler run complete 11792 1727096162.56510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096162.56700: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096162.56748: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096162.56786: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096162.56820: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096162.56900: variable '__install_status' from source: set_fact 11792 1727096162.56928: Evaluated conditional (__install_status is success): True 11792 1727096162.56950: attempt loop complete, returning result 11792 1727096162.56958: _execute() done 11792 1727096162.56964: dumping result to json 11792 1727096162.56976: done dumping result, returning 11792 1727096162.56988: done running TaskExecutor() for managed_node2/TASK: Install dnsmasq [0afff68d-5257-d9c7-3fc0-000000000974] 11792 1727096162.56996: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000974 ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11792 1727096162.57213: no more pending results, returning what we have 11792 1727096162.57218: results queue empty 11792 1727096162.57219: checking for any_errors_fatal 11792 1727096162.57221: done checking for any_errors_fatal 11792 1727096162.57221: checking for max_fail_percentage 11792 1727096162.57228: done checking for max_fail_percentage 11792 1727096162.57229: checking to see if all hosts have failed and the running result is not ok 11792 1727096162.57230: done checking to see if all hosts have failed 11792 1727096162.57230: getting the remaining hosts for this loop 11792 1727096162.57234: done getting the remaining hosts for this loop 11792 1727096162.57238: getting the next task for host managed_node2 11792 1727096162.57247: done getting next task for host managed_node2 11792 1727096162.57249: ^ task is: TASK: Install pgrep, sysctl 11792 1727096162.57257: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096162.57262: getting variables 11792 1727096162.57264: in VariableManager get_vars() 11792 1727096162.57412: Calling all_inventory to load vars for managed_node2 11792 1727096162.57415: Calling groups_inventory to load vars for managed_node2 11792 1727096162.57418: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096162.57432: Calling all_plugins_play to load vars for managed_node2 11792 1727096162.57435: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096162.57438: Calling groups_plugins_play to load vars for managed_node2 11792 1727096162.58281: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000974 11792 1727096162.58284: WORKER PROCESS EXITING 11792 1727096162.59670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096162.62004: done with get_vars() 11792 1727096162.62044: done getting variables 11792 1727096162.62106: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Monday 23 September 2024 08:56:02 -0400 (0:00:00.803) 0:00:44.901 ****** 11792 1727096162.62255: entering _queue_task() for managed_node2/package 11792 1727096162.63187: worker is 1 (out of 1 available) 11792 1727096162.63197: exiting _queue_task() for managed_node2/package 11792 1727096162.63209: done queuing things up, now waiting for results queue to drain 11792 1727096162.63211: waiting for pending results... 
11792 1727096162.63483: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11792 1727096162.63801: in run() - task 0afff68d-5257-d9c7-3fc0-000000000975 11792 1727096162.63825: variable 'ansible_search_path' from source: unknown 11792 1727096162.63835: variable 'ansible_search_path' from source: unknown 11792 1727096162.63886: calling self._execute() 11792 1727096162.64176: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096162.64190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096162.64205: variable 'omit' from source: magic vars 11792 1727096162.65173: variable 'ansible_distribution_major_version' from source: facts 11792 1727096162.65178: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096162.65181: variable 'ansible_os_family' from source: facts 11792 1727096162.65184: Evaluated conditional (ansible_os_family == 'RedHat'): True 11792 1727096162.65508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096162.66155: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096162.66208: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096162.66410: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096162.66449: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096162.66544: variable 'ansible_distribution_major_version' from source: facts 11792 1727096162.66973: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11792 1727096162.66977: when evaluation is False, skipping this task 11792 1727096162.66980: _execute() done 11792 1727096162.66982: dumping result to json 11792 1727096162.66984: done dumping result, returning 11792 1727096162.66987: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0afff68d-5257-d9c7-3fc0-000000000975] 11792 1727096162.66989: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000975 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11792 1727096162.67119: no more pending results, returning what we have 11792 1727096162.67124: results queue empty 11792 1727096162.67125: checking for any_errors_fatal 11792 1727096162.67135: done checking for any_errors_fatal 11792 1727096162.67135: checking for max_fail_percentage 11792 1727096162.67137: done checking for max_fail_percentage 11792 1727096162.67143: checking to see if all hosts have failed and the running result is not ok 11792 1727096162.67144: done checking to see if all hosts have failed 11792 1727096162.67144: getting the remaining hosts for this loop 11792 1727096162.67146: done getting the remaining hosts for this loop 11792 1727096162.67150: getting the next task for host managed_node2 11792 1727096162.67157: done getting next task for host managed_node2 11792 1727096162.67160: ^ task is: TASK: Install pgrep, sysctl 11792 1727096162.67165: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096162.67172: getting variables 11792 1727096162.67173: in VariableManager get_vars() 11792 1727096162.67213: Calling all_inventory to load vars for managed_node2 11792 1727096162.67215: Calling groups_inventory to load vars for managed_node2 11792 1727096162.67218: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096162.67232: Calling all_plugins_play to load vars for managed_node2 11792 1727096162.67234: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096162.67238: Calling groups_plugins_play to load vars for managed_node2 11792 1727096162.67786: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000975 11792 1727096162.67792: WORKER PROCESS EXITING 11792 1727096162.70039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096162.72613: done with get_vars() 11792 1727096162.72642: done getting variables 11792 1727096162.72712: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Monday 23 September 2024 08:56:02 -0400 (0:00:00.104) 0:00:45.006 ****** 11792 1727096162.72746: entering _queue_task() for managed_node2/package 11792 1727096162.73200: worker is 1 (out of 1 available) 11792 1727096162.73212: exiting _queue_task() for managed_node2/package 11792 1727096162.73225: done queuing things up, now waiting for results queue to drain 11792 1727096162.73227: waiting for pending results... 
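The records above and below document a pair of same-named "Install pgrep, sysctl" tasks at lines 17 and 26 of create_test_interfaces_with_dhcp.yml. The first is skipped because ansible_distribution_major_version is version('6', '<=') evaluates to False on this host (after ansible_os_family == 'RedHat' evaluates True), while the second passes the complementary version('7', '>=') test and goes on to install procps-ng, the EL7+ package that provides pgrep and sysctl. A plausible reconstruction of the two guarded tasks follows; the EL6 package name is an assumption, not something visible in this log.

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps    # assumed EL6 package name; only the EL7+ branch runs in this log
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('6', '<=')

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps-ng
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('7', '>=')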
11792 1727096162.73475: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11792 1727096162.73588: in run() - task 0afff68d-5257-d9c7-3fc0-000000000976 11792 1727096162.73608: variable 'ansible_search_path' from source: unknown 11792 1727096162.73614: variable 'ansible_search_path' from source: unknown 11792 1727096162.73656: calling self._execute() 11792 1727096162.73973: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096162.73977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096162.73980: variable 'omit' from source: magic vars 11792 1727096162.74148: variable 'ansible_distribution_major_version' from source: facts 11792 1727096162.74160: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096162.74272: variable 'ansible_os_family' from source: facts 11792 1727096162.74279: Evaluated conditional (ansible_os_family == 'RedHat'): True 11792 1727096162.74441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096162.74725: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096162.74769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096162.74803: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096162.74843: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096162.74914: variable 'ansible_distribution_major_version' from source: facts 11792 1727096162.74927: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11792 1727096162.74939: variable 'omit' from source: magic vars 11792 1727096162.74983: variable 'omit' from source: magic vars 11792 1727096162.75372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096162.77459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096162.77525: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096162.77570: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096162.77604: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096162.77630: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096162.77725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096162.77756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096162.77788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096162.77827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096162.77843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096162.77944: variable '__network_is_ostree' from source: set_fact 11792 1727096162.77947: variable 'omit' from source: magic vars 11792 1727096162.77986: variable 'omit' from source: magic vars 11792 1727096162.78012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096162.78039: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096162.78058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096162.78075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096162.78093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096162.78122: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096162.78126: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096162.78128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096162.78233: Set connection var ansible_timeout to 10 11792 1727096162.78242: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096162.78251: Set connection var ansible_shell_executable to /bin/sh 11792 1727096162.78256: Set connection var ansible_pipelining to False 11792 1727096162.78259: Set connection var ansible_shell_type to sh 11792 1727096162.78261: Set connection var ansible_connection to ssh 11792 1727096162.78286: variable 'ansible_shell_executable' from source: unknown 11792 1727096162.78289: variable 'ansible_connection' from source: unknown 11792 1727096162.78291: variable 'ansible_module_compression' from source: unknown 11792 1727096162.78293: variable 'ansible_shell_type' from source: unknown 11792 1727096162.78306: variable 'ansible_shell_executable' from source: unknown 11792 1727096162.78309: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096162.78314: variable 'ansible_pipelining' from source: unknown 11792 1727096162.78317: variable 'ansible_timeout' from source: unknown 11792 1727096162.78321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096162.78427: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096162.78434: variable 'omit' from source: magic vars 11792 1727096162.78439: starting attempt loop 11792 1727096162.78443: running the handler 11792 1727096162.78449: variable 'ansible_facts' from source: unknown 11792 1727096162.78454: variable 'ansible_facts' from source: unknown 11792 1727096162.78489: _low_level_execute_command(): starting 11792 1727096162.78495: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096162.79273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 11792 1727096162.79283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096162.79327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096162.79345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096162.79411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096162.81108: stdout chunk (state=3): >>>/root <<< 11792 1727096162.81278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096162.81282: stdout chunk (state=3): >>><<< 11792 1727096162.81284: stderr chunk (state=3): >>><<< 11792 1727096162.81305: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096162.81323: _low_level_execute_command(): starting 11792 1727096162.81334: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458 `" && echo ansible-tmp-1727096162.8131206-13958-183140747309458="` echo /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458 `" ) && sleep 0' 11792 1727096162.81979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096162.81994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096162.82010: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096162.82288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096162.82310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096162.82408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096162.84331: stdout chunk (state=3): >>>ansible-tmp-1727096162.8131206-13958-183140747309458=/root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458 <<< 11792 1727096162.84481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096162.84491: stdout chunk (state=3): >>><<< 11792 1727096162.84502: stderr chunk (state=3): >>><<< 11792 1727096162.84778: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096162.8131206-13958-183140747309458=/root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096162.84782: variable 'ansible_module_compression' from source: unknown 11792 1727096162.84784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11792 1727096162.84786: variable 'ansible_facts' from source: unknown 11792 1727096162.84927: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/AnsiballZ_dnf.py 11792 1727096162.85087: Sending initial data 11792 1727096162.85096: Sent initial data (152 bytes) 11792 1727096162.85677: stderr chunk 
(state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096162.85777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096162.85785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096162.85856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096162.87443: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096162.87497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096162.87528: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp2unbmybp /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/AnsiballZ_dnf.py <<< 11792 1727096162.87532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/AnsiballZ_dnf.py" <<< 11792 1727096162.87592: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp2unbmybp" to remote "/root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/AnsiballZ_dnf.py" <<< 11792 1727096162.88495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096162.88535: stderr chunk (state=3): >>><<< 11792 1727096162.88545: stdout chunk (state=3): >>><<< 11792 1727096162.88661: done transferring module to remote 11792 1727096162.88665: _low_level_execute_command(): starting 11792 1727096162.88674: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/ /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/AnsiballZ_dnf.py && sleep 0' 11792 1727096162.89264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096162.89275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096162.89335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096162.91375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096162.91380: stdout chunk (state=3): >>><<< 11792 1727096162.91382: stderr chunk (state=3): >>><<< 11792 1727096162.91575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096162.91579: _low_level_execute_command(): starting 11792 1727096162.91583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/AnsiballZ_dnf.py && sleep 0' 11792 1727096162.92065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096162.92076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096162.92088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096162.92102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096162.92115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096162.92122: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096162.92139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096162.92154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096162.92188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096162.92266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096162.92314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096162.92360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096163.35226: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11792 1727096163.40174: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096163.40178: stdout chunk (state=3): >>><<< 11792 1727096163.40181: stderr chunk (state=3): >>><<< 11792 1727096163.40199: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
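As with the dnsmasq task, the procps-ng install above goes through the full non-pipelined execution path that this log records for every module: discover the remote home directory with echo ~, create a per-task directory under /root/.ansible/tmp, sftp the AnsiballZ_dnf.py wrapper across, chmod u+x it, run it with /usr/bin/python3.12, and finally rm -f -r the directory in the records that follow. The connection setup earlier shows ansible_pipelining set to False, which is what forces the sftp staging on every task. A minimal sketch of the inventory variable that would instead send the module payload over the existing SSH connection's stdin is shown below; ansible_pipelining is the standard connection option, while the file location is an assumption (pipelining also requires that the remote sudoers configuration not demand a tty, which this log cannot confirm).

# group_vars/all.yml (hypothetical location)
ansible_pipelining: true   # send AnsiballZ payloads over stdin instead of staging them in a temp directory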
11792 1727096163.40363: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096163.40372: _low_level_execute_command(): starting 11792 1727096163.40376: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096162.8131206-13958-183140747309458/ > /dev/null 2>&1 && sleep 0' 11792 1727096163.40951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096163.40966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096163.40983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096163.41003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096163.41021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096163.41036: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096163.41136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096163.41147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096163.41163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096163.41229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096163.43475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096163.43480: stdout chunk (state=3): >>><<< 11792 1727096163.43482: stderr chunk (state=3): >>><<< 11792 1727096163.43485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096163.43488: handler run complete 11792 1727096163.43490: attempt loop complete, returning result 11792 1727096163.43492: _execute() done 11792 1727096163.43494: dumping result to json 11792 1727096163.43496: done dumping result, returning 11792 1727096163.43498: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0afff68d-5257-d9c7-3fc0-000000000976] 11792 1727096163.43500: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000976 ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11792 1727096163.43651: no more pending results, returning what we have 11792 1727096163.43657: results queue empty 11792 1727096163.43658: checking for any_errors_fatal 11792 1727096163.43666: done checking for any_errors_fatal 11792 1727096163.43722: checking for max_fail_percentage 11792 1727096163.43726: done checking for max_fail_percentage 11792 1727096163.43727: checking to see if all hosts have failed and the running result is not ok 11792 1727096163.43728: done checking to see if all hosts have failed 11792 1727096163.43729: getting the remaining hosts for this loop 11792 1727096163.43730: done getting the remaining hosts for this loop 11792 1727096163.43734: getting the next task for host managed_node2 11792 1727096163.43742: done getting next task for host managed_node2 11792 1727096163.43744: ^ task is: TASK: Create test interfaces 11792 1727096163.43748: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096163.43756: getting variables 11792 1727096163.43757: in VariableManager get_vars() 11792 1727096163.43992: Calling all_inventory to load vars for managed_node2 11792 1727096163.43995: Calling groups_inventory to load vars for managed_node2 11792 1727096163.43998: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096163.44026: Calling all_plugins_play to load vars for managed_node2 11792 1727096163.44031: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096163.44038: Calling groups_plugins_play to load vars for managed_node2 11792 1727096163.44673: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000976 11792 1727096163.44677: WORKER PROCESS EXITING 11792 1727096163.47209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096163.50691: done with get_vars() 11792 1727096163.50838: done getting variables 11792 1727096163.50906: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Monday 23 September 2024 08:56:03 -0400 (0:00:00.782) 0:00:45.789 ****** 11792 1727096163.50992: entering _queue_task() for managed_node2/shell 11792 1727096163.51710: worker is 1 (out of 1 available) 11792 1727096163.51725: exiting _queue_task() for managed_node2/shell 11792 1727096163.51738: done queuing things up, now waiting for results queue to drain 11792 1727096163.51740: waiting for pending results... 
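(For reference: the task being queued here comes from create_test_interfaces_with_dhcp.yml:35, per the task path shown above, and its full shell payload is captured verbatim in the module_args later in this log. As a hedged sketch only, a shell task of roughly this shape would produce that invocation; options other than the script itself are not visible in the log, and the changed_when line is an assumption inferred from the final result reporting "changed": false even though the module output itself says changed=true.)

    - name: Create test interfaces
      ansible.builtin.shell: |
        set -euxo pipefail
        exec 1>&2
        ip link add test1 type veth peer name test1p
        ip link add test2 type veth peer name test2p
        # ... remainder of the script exactly as captured in module_args below
      changed_when: false   # assumption: would explain the ok / changed=false result shown later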
11792 1727096163.52397: running TaskExecutor() for managed_node2/TASK: Create test interfaces 11792 1727096163.52851: in run() - task 0afff68d-5257-d9c7-3fc0-000000000977 11792 1727096163.52859: variable 'ansible_search_path' from source: unknown 11792 1727096163.52863: variable 'ansible_search_path' from source: unknown 11792 1727096163.52869: calling self._execute() 11792 1727096163.53072: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096163.53087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096163.53103: variable 'omit' from source: magic vars 11792 1727096163.54066: variable 'ansible_distribution_major_version' from source: facts 11792 1727096163.54096: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096163.54264: variable 'omit' from source: magic vars 11792 1727096163.54269: variable 'omit' from source: magic vars 11792 1727096163.55245: variable 'dhcp_interface1' from source: play vars 11792 1727096163.55249: variable 'dhcp_interface2' from source: play vars 11792 1727096163.55254: variable 'omit' from source: magic vars 11792 1727096163.55355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096163.55447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096163.55572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096163.55577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096163.55592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096163.55690: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096163.55755: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096163.55764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096163.56003: Set connection var ansible_timeout to 10 11792 1727096163.56019: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096163.56032: Set connection var ansible_shell_executable to /bin/sh 11792 1727096163.56108: Set connection var ansible_pipelining to False 11792 1727096163.56111: Set connection var ansible_shell_type to sh 11792 1727096163.56114: Set connection var ansible_connection to ssh 11792 1727096163.56216: variable 'ansible_shell_executable' from source: unknown 11792 1727096163.56220: variable 'ansible_connection' from source: unknown 11792 1727096163.56222: variable 'ansible_module_compression' from source: unknown 11792 1727096163.56224: variable 'ansible_shell_type' from source: unknown 11792 1727096163.56226: variable 'ansible_shell_executable' from source: unknown 11792 1727096163.56229: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096163.56232: variable 'ansible_pipelining' from source: unknown 11792 1727096163.56234: variable 'ansible_timeout' from source: unknown 11792 1727096163.56236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096163.56564: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096163.56587: variable 'omit' from source: magic vars 11792 1727096163.56598: starting attempt loop 11792 1727096163.56606: running the handler 11792 1727096163.56621: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096163.56648: _low_level_execute_command(): starting 11792 1727096163.56683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096163.57959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096163.57978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096163.58043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096163.58113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096163.58144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096163.58271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096163.59970: stdout chunk (state=3): >>>/root <<< 11792 1727096163.60212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096163.60216: stdout chunk (state=3): >>><<< 11792 1727096163.60218: stderr chunk (state=3): >>><<< 11792 1727096163.60376: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096163.60381: _low_level_execute_command(): starting 11792 1727096163.60384: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651 `" && echo ansible-tmp-1727096163.602428-13995-29303630941651="` echo /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651 `" ) && sleep 0' 11792 1727096163.61689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096163.61705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096163.61747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096163.61820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096163.61834: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096163.61972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096163.62127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096163.62141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096163.64144: stdout chunk (state=3): >>>ansible-tmp-1727096163.602428-13995-29303630941651=/root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651 <<< 11792 1727096163.64463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096163.64470: stdout chunk (state=3): >>><<< 11792 1727096163.64473: stderr chunk (state=3): >>><<< 11792 1727096163.64476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096163.602428-13995-29303630941651=/root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096163.64479: variable 'ansible_module_compression' from source: unknown 11792 1727096163.64677: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096163.64680: variable 'ansible_facts' from source: unknown 11792 1727096163.64856: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/AnsiballZ_command.py 11792 1727096163.65202: Sending initial data 11792 1727096163.65211: Sent initial data (154 bytes) 11792 1727096163.66504: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096163.66614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096163.66917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096163.66938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096163.67016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096163.68700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096163.68741: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096163.68884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/AnsiballZ_command.py" <<< 11792 1727096163.68887: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp1k3kj2dq /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/AnsiballZ_command.py <<< 11792 1727096163.68891: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp1k3kj2dq" to remote "/root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/AnsiballZ_command.py" <<< 11792 1727096163.70475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096163.70486: stdout chunk (state=3): >>><<< 11792 1727096163.70787: stderr chunk (state=3): >>><<< 11792 1727096163.70791: done transferring module to remote 11792 1727096163.70794: _low_level_execute_command(): starting 11792 1727096163.70796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/ /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/AnsiballZ_command.py && sleep 0' 11792 1727096163.71891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096163.72087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096163.72291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096163.72346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096163.74530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096163.74534: stdout chunk (state=3): >>><<< 11792 1727096163.74536: stderr chunk (state=3): >>><<< 11792 1727096163.74766: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096163.74773: _low_level_execute_command(): starting 11792 1727096163.74776: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/AnsiballZ_command.py && sleep 0' 11792 1727096163.75978: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096163.76095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096163.76109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096163.76188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096165.14796: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set 
-euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:56:03.918420", "end": "2024-09-23 08:56:05.146069", "delta": "0:00:01.227649", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096165.16534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096165.16539: stdout chunk (state=3): >>><<< 11792 1727096165.16542: stderr chunk (state=3): >>><<< 11792 1727096165.16677: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6933 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:56:03.918420", "end": "2024-09-23 08:56:05.146069", "delta": "0:00:01.227649", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096165.16688: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096165.16691: _low_level_execute_command(): starting 11792 1727096165.16694: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096163.602428-13995-29303630941651/ > /dev/null 2>&1 && sleep 0' 11792 1727096165.18080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096165.18106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096165.18210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096165.18399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096165.18438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096165.18452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096165.20399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096165.20410: stdout chunk (state=3): >>><<< 11792 1727096165.20421: stderr chunk (state=3): >>><<< 11792 1727096165.20443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096165.20455: handler run complete 11792 1727096165.20486: Evaluated conditional (False): False 11792 1727096165.20500: attempt loop complete, returning result 11792 1727096165.20507: _execute() done 11792 1727096165.20513: dumping result to json 11792 1727096165.20522: done dumping result, returning 11792 1727096165.20533: done running TaskExecutor() for managed_node2/TASK: Create test interfaces [0afff68d-5257-d9c7-3fc0-000000000977] 11792 1727096165.20541: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000977 11792 1727096165.20676: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000977 ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.227649", "end": "2024-09-23 08:56:05.146069", "rc": 0, "start": "2024-09-23 08:56:03.918420" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6933 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6933 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11792 1727096165.20758: no more pending results, returning what we have 11792 1727096165.20761: results queue empty 11792 1727096165.20762: checking for any_errors_fatal 11792 1727096165.20774: done checking for any_errors_fatal 11792 1727096165.20775: checking for max_fail_percentage 11792 1727096165.20777: done checking for max_fail_percentage 11792 1727096165.20778: checking to see if all hosts have failed and the running result is not ok 11792 1727096165.20779: done checking to see if all hosts have failed 11792 1727096165.20779: getting the remaining hosts for this loop 11792 1727096165.20781: done getting the remaining hosts for this loop 11792 1727096165.20870: getting the next task for host managed_node2 11792 1727096165.20881: done getting next task for host managed_node2 11792 1727096165.20885: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 11792 1727096165.20889: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096165.20898: getting variables 11792 1727096165.20900: in VariableManager get_vars() 11792 1727096165.20934: Calling all_inventory to load vars for managed_node2 11792 1727096165.20937: Calling groups_inventory to load vars for managed_node2 11792 1727096165.20939: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096165.20947: WORKER PROCESS EXITING 11792 1727096165.21016: Calling all_plugins_play to load vars for managed_node2 11792 1727096165.21019: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096165.21023: Calling groups_plugins_play to load vars for managed_node2 11792 1727096165.22538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096165.31578: done with get_vars() 11792 1727096165.31608: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:56:05 -0400 (0:00:01.807) 0:00:47.596 ****** 11792 1727096165.31712: entering _queue_task() for managed_node2/include_tasks 11792 1727096165.32217: worker is 1 (out of 1 available) 11792 1727096165.32227: exiting _queue_task() for managed_node2/include_tasks 11792 1727096165.32243: done queuing things up, now waiting for results queue to drain 11792 1727096165.32245: waiting for pending results... 
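(The include queued above lives in assert_device_present.yml inside the test collection; the surrounding variable lookups — 'interface' resolving from task vars, 'dhcp_interface1'/'dhcp_interface2' coming from play vars — suggest the test playbook invokes that file once per DHCP test interface. A minimal, hedged sketch of such a call is shown below; only the variable and file names are taken from this log, the exact task wording and path layout are assumptions, and the second interface would be handled the same way with dhcp_interface2:

- name: Assert that the first DHCP test device is present
  include_tasks: tasks/assert_device_present.yml
  vars:
    interface: "{{ dhcp_interface1 }}"
)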
11792 1727096165.32506: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11792 1727096165.32650: in run() - task 0afff68d-5257-d9c7-3fc0-00000000097e 11792 1727096165.32669: variable 'ansible_search_path' from source: unknown 11792 1727096165.32673: variable 'ansible_search_path' from source: unknown 11792 1727096165.32749: calling self._execute() 11792 1727096165.33174: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.33179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.33182: variable 'omit' from source: magic vars 11792 1727096165.33293: variable 'ansible_distribution_major_version' from source: facts 11792 1727096165.33308: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096165.33313: _execute() done 11792 1727096165.33316: dumping result to json 11792 1727096165.33318: done dumping result, returning 11792 1727096165.33324: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-d9c7-3fc0-00000000097e] 11792 1727096165.33327: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000097e 11792 1727096165.33441: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000097e 11792 1727096165.33443: WORKER PROCESS EXITING 11792 1727096165.33482: no more pending results, returning what we have 11792 1727096165.33492: in VariableManager get_vars() 11792 1727096165.33541: Calling all_inventory to load vars for managed_node2 11792 1727096165.33545: Calling groups_inventory to load vars for managed_node2 11792 1727096165.33547: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096165.33566: Calling all_plugins_play to load vars for managed_node2 11792 1727096165.33571: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096165.33575: Calling groups_plugins_play to load vars for managed_node2 11792 1727096165.35551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096165.37297: done with get_vars() 11792 1727096165.37323: variable 'ansible_search_path' from source: unknown 11792 1727096165.37325: variable 'ansible_search_path' from source: unknown 11792 1727096165.37381: we have included files to process 11792 1727096165.37383: generating all_blocks data 11792 1727096165.37385: done generating all_blocks data 11792 1727096165.37391: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096165.37392: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096165.37395: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096165.37610: done processing included file 11792 1727096165.37612: iterating over new_blocks loaded from include file 11792 1727096165.37614: in VariableManager get_vars() 11792 1727096165.37634: done with get_vars() 11792 1727096165.37636: filtering new block on tags 11792 1727096165.37682: done filtering new block on tags 11792 1727096165.37689: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11792 
1727096165.37695: extending task lists for all hosts with included blocks 11792 1727096165.38071: done extending task lists 11792 1727096165.38072: done processing included files 11792 1727096165.38073: results queue empty 11792 1727096165.38074: checking for any_errors_fatal 11792 1727096165.38083: done checking for any_errors_fatal 11792 1727096165.38084: checking for max_fail_percentage 11792 1727096165.38085: done checking for max_fail_percentage 11792 1727096165.38086: checking to see if all hosts have failed and the running result is not ok 11792 1727096165.38087: done checking to see if all hosts have failed 11792 1727096165.38087: getting the remaining hosts for this loop 11792 1727096165.38088: done getting the remaining hosts for this loop 11792 1727096165.38091: getting the next task for host managed_node2 11792 1727096165.38096: done getting next task for host managed_node2 11792 1727096165.38098: ^ task is: TASK: Get stat for interface {{ interface }} 11792 1727096165.38103: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096165.38105: getting variables 11792 1727096165.38106: in VariableManager get_vars() 11792 1727096165.38141: Calling all_inventory to load vars for managed_node2 11792 1727096165.38144: Calling groups_inventory to load vars for managed_node2 11792 1727096165.38146: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096165.38152: Calling all_plugins_play to load vars for managed_node2 11792 1727096165.38157: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096165.38161: Calling groups_plugins_play to load vars for managed_node2 11792 1727096165.39551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096165.41239: done with get_vars() 11792 1727096165.41277: done getting variables 11792 1727096165.41474: variable 'interface' from source: task vars 11792 1727096165.41478: variable 'dhcp_interface1' from source: play vars 11792 1727096165.41543: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:56:05 -0400 (0:00:00.098) 0:00:47.695 ****** 11792 1727096165.41582: entering _queue_task() for managed_node2/stat 11792 1727096165.42002: worker is 1 (out of 1 available) 11792 1727096165.42015: exiting _queue_task() for managed_node2/stat 11792 1727096165.42028: done queuing things up, now waiting for results queue to drain 11792 1727096165.42030: waiting for pending results... 11792 1727096165.42243: running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 11792 1727096165.42340: in run() - task 0afff68d-5257-d9c7-3fc0-0000000009dd 11792 1727096165.42354: variable 'ansible_search_path' from source: unknown 11792 1727096165.42360: variable 'ansible_search_path' from source: unknown 11792 1727096165.42389: calling self._execute() 11792 1727096165.42466: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.42475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.42483: variable 'omit' from source: magic vars 11792 1727096165.42762: variable 'ansible_distribution_major_version' from source: facts 11792 1727096165.42774: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096165.42781: variable 'omit' from source: magic vars 11792 1727096165.42823: variable 'omit' from source: magic vars 11792 1727096165.42895: variable 'interface' from source: task vars 11792 1727096165.42900: variable 'dhcp_interface1' from source: play vars 11792 1727096165.42948: variable 'dhcp_interface1' from source: play vars 11792 1727096165.42968: variable 'omit' from source: magic vars 11792 1727096165.43003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096165.43033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096165.43050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096165.43071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096165.43075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11792 1727096165.43099: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096165.43103: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.43106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.43177: Set connection var ansible_timeout to 10 11792 1727096165.43185: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096165.43194: Set connection var ansible_shell_executable to /bin/sh 11792 1727096165.43198: Set connection var ansible_pipelining to False 11792 1727096165.43201: Set connection var ansible_shell_type to sh 11792 1727096165.43203: Set connection var ansible_connection to ssh 11792 1727096165.43220: variable 'ansible_shell_executable' from source: unknown 11792 1727096165.43223: variable 'ansible_connection' from source: unknown 11792 1727096165.43226: variable 'ansible_module_compression' from source: unknown 11792 1727096165.43230: variable 'ansible_shell_type' from source: unknown 11792 1727096165.43233: variable 'ansible_shell_executable' from source: unknown 11792 1727096165.43235: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.43237: variable 'ansible_pipelining' from source: unknown 11792 1727096165.43240: variable 'ansible_timeout' from source: unknown 11792 1727096165.43243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.43396: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096165.43407: variable 'omit' from source: magic vars 11792 1727096165.43413: starting attempt loop 11792 1727096165.43415: running the handler 11792 1727096165.43427: _low_level_execute_command(): starting 11792 1727096165.43434: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096165.44151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096165.44182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096165.44200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096165.44271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096165.46021: stdout chunk (state=3): >>>/root <<< 11792 1727096165.46144: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11792 1727096165.46155: stdout chunk (state=3): >>><<< 11792 1727096165.46162: stderr chunk (state=3): >>><<< 11792 1727096165.46255: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096165.46261: _low_level_execute_command(): starting 11792 1727096165.46263: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815 `" && echo ansible-tmp-1727096165.461853-14047-45168253310815="` echo /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815 `" ) && sleep 0' 11792 1727096165.46701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096165.46704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096165.46708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096165.46718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096165.46721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096165.46791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096165.46817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096165.46860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096165.48858: stdout chunk (state=3): 
>>>ansible-tmp-1727096165.461853-14047-45168253310815=/root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815 <<< 11792 1727096165.49013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096165.49017: stdout chunk (state=3): >>><<< 11792 1727096165.49019: stderr chunk (state=3): >>><<< 11792 1727096165.49040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096165.461853-14047-45168253310815=/root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096165.49095: variable 'ansible_module_compression' from source: unknown 11792 1727096165.49373: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096165.49376: variable 'ansible_facts' from source: unknown 11792 1727096165.49378: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/AnsiballZ_stat.py 11792 1727096165.49525: Sending initial data 11792 1727096165.49528: Sent initial data (151 bytes) 11792 1727096165.50255: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096165.50382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096165.50427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096165.50456: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11792 1727096165.52170: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096165.52282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/AnsiballZ_stat.py" <<< 11792 1727096165.52376: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp1udp840m /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/AnsiballZ_stat.py <<< 11792 1727096165.52476: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp1udp840m" to remote "/root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/AnsiballZ_stat.py" <<< 11792 1727096165.53638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096165.53824: stderr chunk (state=3): >>><<< 11792 1727096165.53827: stdout chunk (state=3): >>><<< 11792 1727096165.53831: done transferring module to remote 11792 1727096165.53833: _low_level_execute_command(): starting 11792 1727096165.53835: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/ /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/AnsiballZ_stat.py && sleep 0' 11792 1727096165.54813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096165.55041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096165.55187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096165.55478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 
1727096165.55621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096165.55693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096165.57639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096165.57735: stderr chunk (state=3): >>><<< 11792 1727096165.57750: stdout chunk (state=3): >>><<< 11792 1727096165.57779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096165.57854: _low_level_execute_command(): starting 11792 1727096165.57857: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/AnsiballZ_stat.py && sleep 0' 11792 1727096165.59089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096165.59100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096165.59116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096165.59147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096165.59251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096165.75500: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": 
false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27781, "dev": 23, "nlink": 1, "atime": 1727096163.9250665, "mtime": 1727096163.9250665, "ctime": 1727096163.9250665, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096165.76756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096165.76817: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. <<< 11792 1727096165.76833: stdout chunk (state=3): >>><<< 11792 1727096165.76845: stderr chunk (state=3): >>><<< 11792 1727096165.76936: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27781, "dev": 23, "nlink": 1, "atime": 1727096163.9250665, "mtime": 1727096163.9250665, "ctime": 1727096163.9250665, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096165.77192: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096165.77195: _low_level_execute_command(): starting 11792 1727096165.77197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096165.461853-14047-45168253310815/ > /dev/null 2>&1 && sleep 0' 11792 1727096165.78602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096165.78692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096165.78741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096165.78762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096165.78879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096165.80802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096165.80979: stderr chunk (state=3): >>><<< 11792 1727096165.81001: stdout chunk (state=3): >>><<< 11792 1727096165.81037: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 
originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096165.81237: handler run complete 11792 1727096165.81241: attempt loop complete, returning result 11792 1727096165.81243: _execute() done 11792 1727096165.81245: dumping result to json 11792 1727096165.81247: done dumping result, returning 11792 1727096165.81249: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 [0afff68d-5257-d9c7-3fc0-0000000009dd] 11792 1727096165.81250: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000009dd 11792 1727096165.81607: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000009dd 11792 1727096165.81610: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096163.9250665, "block_size": 4096, "blocks": 0, "ctime": 1727096163.9250665, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27781, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727096163.9250665, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11792 1727096165.81714: no more pending results, returning what we have 11792 1727096165.81732: results queue empty 11792 1727096165.81734: checking for any_errors_fatal 11792 1727096165.81736: done checking for any_errors_fatal 11792 1727096165.81736: checking for max_fail_percentage 11792 1727096165.81742: done checking for max_fail_percentage 11792 1727096165.81743: checking to see if all hosts have failed and the running result is not ok 11792 1727096165.81744: done checking to see if all hosts have failed 11792 1727096165.81745: getting the remaining hosts for this loop 11792 1727096165.81746: done getting the remaining hosts for this loop 11792 1727096165.81751: getting the next task for host managed_node2 11792 1727096165.81764: done getting next task for host managed_node2 11792 1727096165.81769: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11792 1727096165.81773: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096165.81777: getting variables 11792 1727096165.81779: in VariableManager get_vars() 11792 1727096165.81818: Calling all_inventory to load vars for managed_node2 11792 1727096165.81821: Calling groups_inventory to load vars for managed_node2 11792 1727096165.81823: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096165.81988: Calling all_plugins_play to load vars for managed_node2 11792 1727096165.81991: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096165.81995: Calling groups_plugins_play to load vars for managed_node2 11792 1727096165.84856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096165.86702: done with get_vars() 11792 1727096165.86745: done getting variables 11792 1727096165.86808: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096165.87229: variable 'interface' from source: task vars 11792 1727096165.87233: variable 'dhcp_interface1' from source: play vars 11792 1727096165.87427: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:56:05 -0400 (0:00:00.459) 0:00:48.154 ****** 11792 1727096165.87564: entering _queue_task() for managed_node2/assert 11792 1727096165.88402: worker is 1 (out of 1 available) 11792 1727096165.88414: exiting _queue_task() for managed_node2/assert 11792 1727096165.88427: done queuing things up, now waiting for results queue to drain 11792 1727096165.88429: waiting for pending results... 
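(The assert task queued here, at assert_device_present.yml:5, only has to check the registered stat result; the evaluation logged below reduces to the single condition interface_stat.stat.exists. A hedged sketch of that task, with layout assumed and the task name and condition taken from this log:

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
)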
11792 1727096165.88872: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' 11792 1727096165.88932: in run() - task 0afff68d-5257-d9c7-3fc0-00000000097f 11792 1727096165.88956: variable 'ansible_search_path' from source: unknown 11792 1727096165.88965: variable 'ansible_search_path' from source: unknown 11792 1727096165.89023: calling self._execute() 11792 1727096165.89146: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.89161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.89179: variable 'omit' from source: magic vars 11792 1727096165.89676: variable 'ansible_distribution_major_version' from source: facts 11792 1727096165.89698: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096165.89751: variable 'omit' from source: magic vars 11792 1727096165.89792: variable 'omit' from source: magic vars 11792 1727096165.89951: variable 'interface' from source: task vars 11792 1727096165.90018: variable 'dhcp_interface1' from source: play vars 11792 1727096165.90094: variable 'dhcp_interface1' from source: play vars 11792 1727096165.90171: variable 'omit' from source: magic vars 11792 1727096165.90202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096165.90255: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096165.90282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096165.90310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096165.90334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096165.90372: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096165.90412: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.90416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.90525: Set connection var ansible_timeout to 10 11792 1727096165.90551: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096165.90573: Set connection var ansible_shell_executable to /bin/sh 11792 1727096165.90631: Set connection var ansible_pipelining to False 11792 1727096165.90634: Set connection var ansible_shell_type to sh 11792 1727096165.90641: Set connection var ansible_connection to ssh 11792 1727096165.90644: variable 'ansible_shell_executable' from source: unknown 11792 1727096165.90647: variable 'ansible_connection' from source: unknown 11792 1727096165.90649: variable 'ansible_module_compression' from source: unknown 11792 1727096165.90658: variable 'ansible_shell_type' from source: unknown 11792 1727096165.90665: variable 'ansible_shell_executable' from source: unknown 11792 1727096165.90675: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.90686: variable 'ansible_pipelining' from source: unknown 11792 1727096165.90740: variable 'ansible_timeout' from source: unknown 11792 1727096165.90743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.90883: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096165.90902: variable 'omit' from source: magic vars 11792 1727096165.90915: starting attempt loop 11792 1727096165.90922: running the handler 11792 1727096165.91096: variable 'interface_stat' from source: set_fact 11792 1727096165.91123: Evaluated conditional (interface_stat.stat.exists): True 11792 1727096165.91201: handler run complete 11792 1727096165.91204: attempt loop complete, returning result 11792 1727096165.91207: _execute() done 11792 1727096165.91209: dumping result to json 11792 1727096165.91211: done dumping result, returning 11792 1727096165.91213: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' [0afff68d-5257-d9c7-3fc0-00000000097f] 11792 1727096165.91215: sending task result for task 0afff68d-5257-d9c7-3fc0-00000000097f ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096165.91445: no more pending results, returning what we have 11792 1727096165.91449: results queue empty 11792 1727096165.91450: checking for any_errors_fatal 11792 1727096165.91464: done checking for any_errors_fatal 11792 1727096165.91465: checking for max_fail_percentage 11792 1727096165.91468: done checking for max_fail_percentage 11792 1727096165.91469: checking to see if all hosts have failed and the running result is not ok 11792 1727096165.91470: done checking to see if all hosts have failed 11792 1727096165.91471: getting the remaining hosts for this loop 11792 1727096165.91473: done getting the remaining hosts for this loop 11792 1727096165.91591: getting the next task for host managed_node2 11792 1727096165.91603: done getting next task for host managed_node2 11792 1727096165.91606: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11792 1727096165.91612: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096165.91617: getting variables 11792 1727096165.91618: in VariableManager get_vars() 11792 1727096165.91663: Calling all_inventory to load vars for managed_node2 11792 1727096165.91666: Calling groups_inventory to load vars for managed_node2 11792 1727096165.91701: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096165.91709: done sending task result for task 0afff68d-5257-d9c7-3fc0-00000000097f 11792 1727096165.91716: WORKER PROCESS EXITING 11792 1727096165.91728: Calling all_plugins_play to load vars for managed_node2 11792 1727096165.91731: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096165.91734: Calling groups_plugins_play to load vars for managed_node2 11792 1727096165.93807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096165.95686: done with get_vars() 11792 1727096165.95715: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:56:05 -0400 (0:00:00.082) 0:00:48.237 ****** 11792 1727096165.95830: entering _queue_task() for managed_node2/include_tasks 11792 1727096165.96203: worker is 1 (out of 1 available) 11792 1727096165.96216: exiting _queue_task() for managed_node2/include_tasks 11792 1727096165.96340: done queuing things up, now waiting for results queue to drain 11792 1727096165.96342: waiting for pending results... 11792 1727096165.96582: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11792 1727096165.96668: in run() - task 0afff68d-5257-d9c7-3fc0-000000000983 11792 1727096165.96682: variable 'ansible_search_path' from source: unknown 11792 1727096165.96685: variable 'ansible_search_path' from source: unknown 11792 1727096165.96716: calling self._execute() 11792 1727096165.96799: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096165.96806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096165.96815: variable 'omit' from source: magic vars 11792 1727096165.97100: variable 'ansible_distribution_major_version' from source: facts 11792 1727096165.97109: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096165.97115: _execute() done 11792 1727096165.97118: dumping result to json 11792 1727096165.97121: done dumping result, returning 11792 1727096165.97129: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-d9c7-3fc0-000000000983] 11792 1727096165.97132: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000983 11792 1727096165.97220: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000983 11792 1727096165.97223: WORKER PROCESS EXITING 11792 1727096165.97266: no more pending results, returning what we have 11792 1727096165.97273: in VariableManager get_vars() 11792 1727096165.97318: Calling all_inventory to load vars for managed_node2 11792 1727096165.97320: Calling groups_inventory to load vars for managed_node2 11792 1727096165.97323: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096165.97337: Calling all_plugins_play to load vars for managed_node2 11792 1727096165.97340: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096165.97343: Calling 
groups_plugins_play to load vars for managed_node2 11792 1727096165.98154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096165.99238: done with get_vars() 11792 1727096165.99261: variable 'ansible_search_path' from source: unknown 11792 1727096165.99263: variable 'ansible_search_path' from source: unknown 11792 1727096165.99387: we have included files to process 11792 1727096165.99388: generating all_blocks data 11792 1727096165.99390: done generating all_blocks data 11792 1727096165.99395: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096165.99396: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096165.99398: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11792 1727096165.99662: done processing included file 11792 1727096165.99665: iterating over new_blocks loaded from include file 11792 1727096165.99666: in VariableManager get_vars() 11792 1727096165.99689: done with get_vars() 11792 1727096165.99691: filtering new block on tags 11792 1727096165.99752: done filtering new block on tags 11792 1727096165.99755: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11792 1727096165.99784: extending task lists for all hosts with included blocks 11792 1727096166.00252: done extending task lists 11792 1727096166.00253: done processing included files 11792 1727096166.00254: results queue empty 11792 1727096166.00255: checking for any_errors_fatal 11792 1727096166.00258: done checking for any_errors_fatal 11792 1727096166.00259: checking for max_fail_percentage 11792 1727096166.00260: done checking for max_fail_percentage 11792 1727096166.00261: checking to see if all hosts have failed and the running result is not ok 11792 1727096166.00262: done checking to see if all hosts have failed 11792 1727096166.00262: getting the remaining hosts for this loop 11792 1727096166.00264: done getting the remaining hosts for this loop 11792 1727096166.00266: getting the next task for host managed_node2 11792 1727096166.00286: done getting next task for host managed_node2 11792 1727096166.00288: ^ task is: TASK: Get stat for interface {{ interface }} 11792 1727096166.00292: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096166.00295: getting variables 11792 1727096166.00296: in VariableManager get_vars() 11792 1727096166.00309: Calling all_inventory to load vars for managed_node2 11792 1727096166.00311: Calling groups_inventory to load vars for managed_node2 11792 1727096166.00314: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096166.00319: Calling all_plugins_play to load vars for managed_node2 11792 1727096166.00322: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096166.00325: Calling groups_plugins_play to load vars for managed_node2 11792 1727096166.01437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096166.02304: done with get_vars() 11792 1727096166.02322: done getting variables 11792 1727096166.02446: variable 'interface' from source: task vars 11792 1727096166.02450: variable 'dhcp_interface2' from source: play vars 11792 1727096166.02494: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:56:06 -0400 (0:00:00.066) 0:00:48.304 ****** 11792 1727096166.02519: entering _queue_task() for managed_node2/stat 11792 1727096166.02793: worker is 1 (out of 1 available) 11792 1727096166.02807: exiting _queue_task() for managed_node2/stat 11792 1727096166.02819: done queuing things up, now waiting for results queue to drain 11792 1727096166.02821: waiting for pending results... 
11792 1727096166.03009: running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 11792 1727096166.03104: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a01 11792 1727096166.03116: variable 'ansible_search_path' from source: unknown 11792 1727096166.03119: variable 'ansible_search_path' from source: unknown 11792 1727096166.03148: calling self._execute() 11792 1727096166.03223: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.03226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.03235: variable 'omit' from source: magic vars 11792 1727096166.03511: variable 'ansible_distribution_major_version' from source: facts 11792 1727096166.03521: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096166.03527: variable 'omit' from source: magic vars 11792 1727096166.03575: variable 'omit' from source: magic vars 11792 1727096166.03673: variable 'interface' from source: task vars 11792 1727096166.03677: variable 'dhcp_interface2' from source: play vars 11792 1727096166.03986: variable 'dhcp_interface2' from source: play vars 11792 1727096166.03993: variable 'omit' from source: magic vars 11792 1727096166.03997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096166.04000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096166.04002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096166.04005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096166.04007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096166.04036: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096166.04076: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.04088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.04428: Set connection var ansible_timeout to 10 11792 1727096166.04456: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096166.04495: Set connection var ansible_shell_executable to /bin/sh 11792 1727096166.04529: Set connection var ansible_pipelining to False 11792 1727096166.04532: Set connection var ansible_shell_type to sh 11792 1727096166.04544: Set connection var ansible_connection to ssh 11792 1727096166.04607: variable 'ansible_shell_executable' from source: unknown 11792 1727096166.04690: variable 'ansible_connection' from source: unknown 11792 1727096166.04703: variable 'ansible_module_compression' from source: unknown 11792 1727096166.04706: variable 'ansible_shell_type' from source: unknown 11792 1727096166.04709: variable 'ansible_shell_executable' from source: unknown 11792 1727096166.04711: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.04713: variable 'ansible_pipelining' from source: unknown 11792 1727096166.04716: variable 'ansible_timeout' from source: unknown 11792 1727096166.04718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.04921: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096166.04931: variable 'omit' from source: magic vars 11792 1727096166.04940: starting attempt loop 11792 1727096166.04944: running the handler 11792 1727096166.04972: _low_level_execute_command(): starting 11792 1727096166.04983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096166.05589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096166.05593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096166.05597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.05664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096166.05671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096166.05677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096166.05722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096166.07405: stdout chunk (state=3): >>>/root <<< 11792 1727096166.07503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096166.07536: stderr chunk (state=3): >>><<< 11792 1727096166.07540: stdout chunk (state=3): >>><<< 11792 1727096166.07565: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096166.07587: _low_level_execute_command(): starting 
11792 1727096166.07591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831 `" && echo ansible-tmp-1727096166.0756638-14077-103059236132831="` echo /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831 `" ) && sleep 0' 11792 1727096166.08201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096166.08204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096166.08207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.08216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096166.08218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096166.08221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.08265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096166.08281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096166.08309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096166.08387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096166.10380: stdout chunk (state=3): >>>ansible-tmp-1727096166.0756638-14077-103059236132831=/root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831 <<< 11792 1727096166.10507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096166.10543: stderr chunk (state=3): >>><<< 11792 1727096166.10547: stdout chunk (state=3): >>><<< 11792 1727096166.10582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096166.0756638-14077-103059236132831=/root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096166.10622: variable 'ansible_module_compression' from source: unknown 11792 1727096166.10695: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11792 1727096166.10728: variable 'ansible_facts' from source: unknown 11792 1727096166.10810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/AnsiballZ_stat.py 11792 1727096166.10943: Sending initial data 11792 1727096166.10947: Sent initial data (153 bytes) 11792 1727096166.11575: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096166.11579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096166.11581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.11584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096166.11586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.11639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096166.11646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096166.11681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096166.13310: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096166.13341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096166.13375: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpaze9j23j /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/AnsiballZ_stat.py <<< 11792 1727096166.13388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/AnsiballZ_stat.py" <<< 11792 1727096166.13408: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpaze9j23j" to remote "/root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/AnsiballZ_stat.py" <<< 11792 1727096166.13419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/AnsiballZ_stat.py" <<< 11792 1727096166.13920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096166.13966: stderr chunk (state=3): >>><<< 11792 1727096166.13972: stdout chunk (state=3): >>><<< 11792 1727096166.14016: done transferring module to remote 11792 1727096166.14025: _low_level_execute_command(): starting 11792 1727096166.14030: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/ /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/AnsiballZ_stat.py && sleep 0' 11792 1727096166.14496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096166.14499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.14502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096166.14509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.14557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096166.14562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096166.14596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096166.16455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096166.16508: stderr chunk (state=3): >>><<< 11792 1727096166.16515: stdout chunk (state=3): >>><<< 11792 1727096166.16544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096166.16548: _low_level_execute_command(): starting 11792 1727096166.16551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/AnsiballZ_stat.py && sleep 0' 11792 1727096166.17173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096166.17177: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096166.17179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.17199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096166.17202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096166.17211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096166.17269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096166.33090: stdout chunk (state=3): >>> <<< 11792 1727096166.33117: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28187, "dev": 23, "nlink": 1, "atime": 1727096163.9310856, "mtime": 1727096163.9310856, "ctime": 1727096163.9310856, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": 
{"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11792 1727096166.34576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096166.34580: stdout chunk (state=3): >>><<< 11792 1727096166.34582: stderr chunk (state=3): >>><<< 11792 1727096166.34585: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28187, "dev": 23, "nlink": 1, "atime": 1727096163.9310856, "mtime": 1727096163.9310856, "ctime": 1727096163.9310856, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096166.34746: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096166.34966: _low_level_execute_command(): starting 11792 1727096166.34972: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096166.0756638-14077-103059236132831/ > /dev/null 2>&1 && sleep 0' 11792 1727096166.35620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096166.35628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096166.35637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096166.35661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096166.35689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096166.35693: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096166.35695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.35715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096166.35718: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096166.35723: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096166.35775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096166.35778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096166.35781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096166.35783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096166.35787: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096166.35791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096166.35840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096166.35848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096166.35960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096166.37899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096166.37903: stdout chunk (state=3): >>><<< 11792 1727096166.37905: stderr chunk (state=3): >>><<< 11792 1727096166.37921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096166.37933: handler run complete 11792 1727096166.38088: attempt loop complete, returning result 11792 1727096166.38091: _execute() done 11792 1727096166.38093: dumping result to json 11792 1727096166.38098: done dumping result, returning 11792 1727096166.38101: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 [0afff68d-5257-d9c7-3fc0-000000000a01] 11792 1727096166.38103: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a01 11792 1727096166.38217: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a01 ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096163.9310856, "block_size": 4096, "blocks": 0, "ctime": 1727096163.9310856, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28187, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727096163.9310856, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11792 1727096166.38454: no more pending results, returning what we have 11792 1727096166.38458: results queue empty 11792 1727096166.38459: checking for any_errors_fatal 11792 1727096166.38462: done checking for any_errors_fatal 11792 1727096166.38463: checking for max_fail_percentage 11792 1727096166.38464: done checking for max_fail_percentage 11792 1727096166.38466: checking to see if all hosts have failed and the running result is not ok 11792 1727096166.38466: done checking to see if all hosts have failed 11792 1727096166.38469: getting the remaining hosts for this loop 11792 1727096166.38471: done getting the remaining hosts for this loop 11792 1727096166.38475: getting the next task for host managed_node2 11792 1727096166.38609: done getting next task for host managed_node2 11792 1727096166.38612: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11792 1727096166.38616: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096166.38621: getting variables 11792 1727096166.38622: in VariableManager get_vars() 11792 1727096166.38715: Calling all_inventory to load vars for managed_node2 11792 1727096166.38719: Calling groups_inventory to load vars for managed_node2 11792 1727096166.38722: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096166.38821: Calling all_plugins_play to load vars for managed_node2 11792 1727096166.38825: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096166.38833: Calling groups_plugins_play to load vars for managed_node2 11792 1727096166.39887: WORKER PROCESS EXITING 11792 1727096166.41414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096166.44911: done with get_vars() 11792 1727096166.44942: done getting variables 11792 1727096166.45031: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096166.45161: variable 'interface' from source: task vars 11792 1727096166.45165: variable 'dhcp_interface2' from source: play vars 11792 1727096166.45224: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:56:06 -0400 (0:00:00.427) 0:00:48.731 ****** 11792 1727096166.45265: entering _queue_task() for managed_node2/assert 11792 1727096166.45764: worker is 1 (out of 1 available) 11792 1727096166.45778: exiting _queue_task() for managed_node2/assert 11792 1727096166.45792: done queuing things up, now waiting for results queue to drain 11792 1727096166.45800: waiting for pending results... 
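The assert task queued here only consumes the registered stat result; the 'assert' action runs on the controller, which is why no SSH _low_level_execute_command() calls appear for it below. A minimal sketch of assert_device_present.yml:5, reconstructed from the task name and the conditional evaluated further down (the fail_msg wording is an assumption and is not visible in this log):

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
    fail_msg: "Interface {{ interface }} is not present"   # assumed message
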
11792 1727096166.46134: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' 11792 1727096166.46272: in run() - task 0afff68d-5257-d9c7-3fc0-000000000984 11792 1727096166.46321: variable 'ansible_search_path' from source: unknown 11792 1727096166.46328: variable 'ansible_search_path' from source: unknown 11792 1727096166.46383: calling self._execute() 11792 1727096166.46492: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.46565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.46574: variable 'omit' from source: magic vars 11792 1727096166.46909: variable 'ansible_distribution_major_version' from source: facts 11792 1727096166.46925: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096166.46937: variable 'omit' from source: magic vars 11792 1727096166.47025: variable 'omit' from source: magic vars 11792 1727096166.47229: variable 'interface' from source: task vars 11792 1727096166.47240: variable 'dhcp_interface2' from source: play vars 11792 1727096166.47302: variable 'dhcp_interface2' from source: play vars 11792 1727096166.47352: variable 'omit' from source: magic vars 11792 1727096166.47442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096166.47451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096166.47516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096166.47680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096166.47684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096166.47687: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096166.47689: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.47692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.47815: Set connection var ansible_timeout to 10 11792 1727096166.47829: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096166.47842: Set connection var ansible_shell_executable to /bin/sh 11792 1727096166.47851: Set connection var ansible_pipelining to False 11792 1727096166.47858: Set connection var ansible_shell_type to sh 11792 1727096166.47875: Set connection var ansible_connection to ssh 11792 1727096166.47973: variable 'ansible_shell_executable' from source: unknown 11792 1727096166.47984: variable 'ansible_connection' from source: unknown 11792 1727096166.47986: variable 'ansible_module_compression' from source: unknown 11792 1727096166.47989: variable 'ansible_shell_type' from source: unknown 11792 1727096166.47991: variable 'ansible_shell_executable' from source: unknown 11792 1727096166.47992: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.47994: variable 'ansible_pipelining' from source: unknown 11792 1727096166.47996: variable 'ansible_timeout' from source: unknown 11792 1727096166.47998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.48200: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096166.48203: variable 'omit' from source: magic vars 11792 1727096166.48205: starting attempt loop 11792 1727096166.48207: running the handler 11792 1727096166.48342: variable 'interface_stat' from source: set_fact 11792 1727096166.48365: Evaluated conditional (interface_stat.stat.exists): True 11792 1727096166.48417: handler run complete 11792 1727096166.48420: attempt loop complete, returning result 11792 1727096166.48422: _execute() done 11792 1727096166.48424: dumping result to json 11792 1727096166.48426: done dumping result, returning 11792 1727096166.48428: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' [0afff68d-5257-d9c7-3fc0-000000000984] 11792 1727096166.48429: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000984 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11792 1727096166.48590: no more pending results, returning what we have 11792 1727096166.48595: results queue empty 11792 1727096166.48595: checking for any_errors_fatal 11792 1727096166.48608: done checking for any_errors_fatal 11792 1727096166.48609: checking for max_fail_percentage 11792 1727096166.48611: done checking for max_fail_percentage 11792 1727096166.48612: checking to see if all hosts have failed and the running result is not ok 11792 1727096166.48613: done checking to see if all hosts have failed 11792 1727096166.48614: getting the remaining hosts for this loop 11792 1727096166.48615: done getting the remaining hosts for this loop 11792 1727096166.48619: getting the next task for host managed_node2 11792 1727096166.48751: done getting next task for host managed_node2 11792 1727096166.48755: ^ task is: TASK: Test 11792 1727096166.48760: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096166.48770: getting variables 11792 1727096166.48772: in VariableManager get_vars() 11792 1727096166.48827: Calling all_inventory to load vars for managed_node2 11792 1727096166.48830: Calling groups_inventory to load vars for managed_node2 11792 1727096166.48833: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096166.49102: Calling all_plugins_play to load vars for managed_node2 11792 1727096166.49108: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096166.49112: Calling groups_plugins_play to load vars for managed_node2 11792 1727096166.49746: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000984 11792 1727096166.49750: WORKER PROCESS EXITING 11792 1727096166.51757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096166.55296: done with get_vars() 11792 1727096166.55333: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Monday 23 September 2024 08:56:06 -0400 (0:00:00.101) 0:00:48.833 ****** 11792 1727096166.55440: entering _queue_task() for managed_node2/include_tasks 11792 1727096166.55931: worker is 1 (out of 1 available) 11792 1727096166.55942: exiting _queue_task() for managed_node2/include_tasks 11792 1727096166.55955: done queuing things up, now waiting for results queue to drain 11792 1727096166.55956: waiting for pending results... 11792 1727096166.56166: running TaskExecutor() for managed_node2/TASK: Test 11792 1727096166.56281: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008ee 11792 1727096166.56307: variable 'ansible_search_path' from source: unknown 11792 1727096166.56315: variable 'ansible_search_path' from source: unknown 11792 1727096166.56374: variable 'lsr_test' from source: include params 11792 1727096166.56974: variable 'lsr_test' from source: include params 11792 1727096166.57133: variable 'omit' from source: magic vars 11792 1727096166.57384: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.57550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.57738: variable 'omit' from source: magic vars 11792 1727096166.58371: variable 'ansible_distribution_major_version' from source: facts 11792 1727096166.58617: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096166.58622: variable 'item' from source: unknown 11792 1727096166.58679: variable 'item' from source: unknown 11792 1727096166.58951: variable 'item' from source: unknown 11792 1727096166.58954: variable 'item' from source: unknown 11792 1727096166.59204: dumping result to json 11792 1727096166.59412: done dumping result, returning 11792 1727096166.59415: done running TaskExecutor() for managed_node2/TASK: Test [0afff68d-5257-d9c7-3fc0-0000000008ee] 11792 1727096166.59418: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ee 11792 1727096166.59462: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ee 11792 1727096166.59465: WORKER PROCESS EXITING 11792 1727096166.59539: no more pending results, returning what we have 11792 1727096166.59545: in VariableManager get_vars() 11792 1727096166.59675: Calling all_inventory to load vars for managed_node2 11792 1727096166.59679: Calling groups_inventory to load vars for managed_node2 11792 1727096166.59682: 
Calling all_plugins_inventory to load vars for managed_node2 11792 1727096166.59698: Calling all_plugins_play to load vars for managed_node2 11792 1727096166.59705: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096166.59709: Calling groups_plugins_play to load vars for managed_node2 11792 1727096166.63430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096166.67095: done with get_vars() 11792 1727096166.67247: variable 'ansible_search_path' from source: unknown 11792 1727096166.67249: variable 'ansible_search_path' from source: unknown 11792 1727096166.67295: we have included files to process 11792 1727096166.67296: generating all_blocks data 11792 1727096166.67298: done generating all_blocks data 11792 1727096166.67303: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11792 1727096166.67305: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11792 1727096166.67307: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11792 1727096166.67719: in VariableManager get_vars() 11792 1727096166.67745: done with get_vars() 11792 1727096166.67750: variable 'omit' from source: magic vars 11792 1727096166.67799: variable 'omit' from source: magic vars 11792 1727096166.67855: in VariableManager get_vars() 11792 1727096166.67871: done with get_vars() 11792 1727096166.67903: in VariableManager get_vars() 11792 1727096166.67922: done with get_vars() 11792 1727096166.67957: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11792 1727096166.68159: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11792 1727096166.68247: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11792 1727096166.68656: in VariableManager get_vars() 11792 1727096166.68681: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096166.71951: done processing included file 11792 1727096166.71954: iterating over new_blocks loaded from include file 11792 1727096166.71955: in VariableManager get_vars() 11792 1727096166.71995: done with get_vars() 11792 1727096166.71997: filtering new block on tags 11792 1727096166.72323: done filtering new block on tags 11792 1727096166.72327: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml for managed_node2 => (item=tasks/create_bond_profile_reconfigure.yml) 11792 1727096166.72333: extending task lists for all hosts with included blocks 11792 1727096166.75405: done extending task lists 11792 1727096166.75408: done processing included files 11792 1727096166.75409: results queue empty 11792 1727096166.75409: checking for any_errors_fatal 11792 1727096166.75413: done checking for any_errors_fatal 11792 1727096166.75414: checking for max_fail_percentage 11792 1727096166.75415: done checking for max_fail_percentage 11792 1727096166.75416: checking to see if all hosts have failed and the running result is not ok 11792 
1727096166.75417: done checking to see if all hosts have failed 11792 1727096166.75417: getting the remaining hosts for this loop 11792 1727096166.75419: done getting the remaining hosts for this loop 11792 1727096166.75421: getting the next task for host managed_node2 11792 1727096166.75542: done getting next task for host managed_node2 11792 1727096166.75547: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096166.75550: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096166.75563: getting variables 11792 1727096166.75564: in VariableManager get_vars() 11792 1727096166.75586: Calling all_inventory to load vars for managed_node2 11792 1727096166.75589: Calling groups_inventory to load vars for managed_node2 11792 1727096166.75591: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096166.75597: Calling all_plugins_play to load vars for managed_node2 11792 1727096166.75599: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096166.75602: Calling groups_plugins_play to load vars for managed_node2 11792 1727096166.78487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096166.81682: done with get_vars() 11792 1727096166.81830: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:06 -0400 (0:00:00.265) 0:00:49.099 ****** 11792 1727096166.81990: entering _queue_task() for managed_node2/include_tasks 11792 1727096166.82826: worker is 1 (out of 1 available) 11792 1727096166.82840: exiting _queue_task() for managed_node2/include_tasks 11792 1727096166.82855: done queuing things up, now waiting for results queue to drain 11792 1727096166.82857: waiting for pending results... 
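The role task at roles/network/tasks/main.yml:4 queued here is an include_tasks (note the _queue_task() entry for managed_node2/include_tasks); the file it pulls in, set_facts.yml, is processed a few lines further down, and its first task at set_facts.yml:3 re-gathers facts only when facts the role needs are missing. A minimal sketch of both tasks, assuming the include is otherwise unconditional and the second task is a plain setup call (the 'when' expression is copied from the conditional evaluated later in this log; setup module arguments are not visible in this excerpt and are omitted):

# roles/network/tasks/main.yml:4 (sketch)
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml

# roles/network/tasks/set_facts.yml:3 (sketch)
- name: Ensure ansible_facts used by role are present
  setup:
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

In this run the difference is empty, so the setup task is skipped, as the "when evaluation is False, skipping this task" message shows further down.
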
11792 1727096166.83690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096166.84133: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a2e 11792 1727096166.84148: variable 'ansible_search_path' from source: unknown 11792 1727096166.84152: variable 'ansible_search_path' from source: unknown 11792 1727096166.84193: calling self._execute() 11792 1727096166.84521: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096166.84529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096166.84537: variable 'omit' from source: magic vars 11792 1727096166.85944: variable 'ansible_distribution_major_version' from source: facts 11792 1727096166.85954: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096166.85966: _execute() done 11792 1727096166.85970: dumping result to json 11792 1727096166.86079: done dumping result, returning 11792 1727096166.86096: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-d9c7-3fc0-000000000a2e] 11792 1727096166.86099: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a2e 11792 1727096166.86255: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a2e 11792 1727096166.86258: WORKER PROCESS EXITING 11792 1727096166.86338: no more pending results, returning what we have 11792 1727096166.86343: in VariableManager get_vars() 11792 1727096166.86398: Calling all_inventory to load vars for managed_node2 11792 1727096166.86401: Calling groups_inventory to load vars for managed_node2 11792 1727096166.86404: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096166.86418: Calling all_plugins_play to load vars for managed_node2 11792 1727096166.86422: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096166.86425: Calling groups_plugins_play to load vars for managed_node2 11792 1727096166.90016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096166.94316: done with get_vars() 11792 1727096166.94350: variable 'ansible_search_path' from source: unknown 11792 1727096166.94351: variable 'ansible_search_path' from source: unknown 11792 1727096166.94474: we have included files to process 11792 1727096166.94476: generating all_blocks data 11792 1727096166.94478: done generating all_blocks data 11792 1727096166.94479: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096166.94480: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096166.94483: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096166.95703: done processing included file 11792 1727096166.95706: iterating over new_blocks loaded from include file 11792 1727096166.95707: in VariableManager get_vars() 11792 1727096166.95854: done with get_vars() 11792 1727096166.95857: filtering new block on tags 11792 1727096166.95893: done filtering new block on tags 11792 1727096166.95897: in VariableManager get_vars() 11792 1727096166.95926: done with get_vars() 11792 1727096166.96040: filtering new block on tags 11792 1727096166.96089: done filtering new block on tags 11792 1727096166.96092: in 
VariableManager get_vars() 11792 1727096166.96119: done with get_vars() 11792 1727096166.96121: filtering new block on tags 11792 1727096166.96283: done filtering new block on tags 11792 1727096166.96322: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11792 1727096166.96329: extending task lists for all hosts with included blocks 11792 1727096166.99810: done extending task lists 11792 1727096166.99812: done processing included files 11792 1727096166.99813: results queue empty 11792 1727096166.99814: checking for any_errors_fatal 11792 1727096166.99818: done checking for any_errors_fatal 11792 1727096166.99819: checking for max_fail_percentage 11792 1727096166.99820: done checking for max_fail_percentage 11792 1727096166.99821: checking to see if all hosts have failed and the running result is not ok 11792 1727096166.99822: done checking to see if all hosts have failed 11792 1727096166.99822: getting the remaining hosts for this loop 11792 1727096166.99824: done getting the remaining hosts for this loop 11792 1727096166.99826: getting the next task for host managed_node2 11792 1727096166.99832: done getting next task for host managed_node2 11792 1727096166.99839: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096166.99844: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096166.99856: getting variables 11792 1727096166.99857: in VariableManager get_vars() 11792 1727096166.99882: Calling all_inventory to load vars for managed_node2 11792 1727096166.99884: Calling groups_inventory to load vars for managed_node2 11792 1727096166.99886: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096166.99892: Calling all_plugins_play to load vars for managed_node2 11792 1727096166.99895: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096166.99898: Calling groups_plugins_play to load vars for managed_node2 11792 1727096167.01941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096167.03547: done with get_vars() 11792 1727096167.03580: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:07 -0400 (0:00:00.217) 0:00:49.316 ****** 11792 1727096167.03728: entering _queue_task() for managed_node2/setup 11792 1727096167.04396: worker is 1 (out of 1 available) 11792 1727096167.04406: exiting _queue_task() for managed_node2/setup 11792 1727096167.04417: done queuing things up, now waiting for results queue to drain 11792 1727096167.04419: waiting for pending results... 11792 1727096167.04659: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096167.04732: in run() - task 0afff68d-5257-d9c7-3fc0-000000000b10 11792 1727096167.04761: variable 'ansible_search_path' from source: unknown 11792 1727096167.04771: variable 'ansible_search_path' from source: unknown 11792 1727096167.04810: calling self._execute() 11792 1727096167.04917: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096167.04929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096167.04944: variable 'omit' from source: magic vars 11792 1727096167.05392: variable 'ansible_distribution_major_version' from source: facts 11792 1727096167.05417: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096167.05858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096167.08286: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096167.08369: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096167.08418: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096167.08463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096167.08496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096167.08591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096167.08632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11792 1727096167.08672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096167.08714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096167.08736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096167.08799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096167.08824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096167.08857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096167.08955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096167.08958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096167.09098: variable '__network_required_facts' from source: role '' defaults 11792 1727096167.09113: variable 'ansible_facts' from source: unknown 11792 1727096167.09924: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11792 1727096167.09939: when evaluation is False, skipping this task 11792 1727096167.09946: _execute() done 11792 1727096167.09952: dumping result to json 11792 1727096167.09969: done dumping result, returning 11792 1727096167.10123: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-d9c7-3fc0-000000000b10] 11792 1727096167.10126: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b10 11792 1727096167.10196: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b10 11792 1727096167.10200: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096167.10252: no more pending results, returning what we have 11792 1727096167.10257: results queue empty 11792 1727096167.10258: checking for any_errors_fatal 11792 1727096167.10260: done checking for any_errors_fatal 11792 1727096167.10261: checking for max_fail_percentage 11792 1727096167.10262: done checking for max_fail_percentage 11792 1727096167.10263: checking to see if all hosts have failed and the running result is not ok 11792 1727096167.10264: done checking to see if all hosts have failed 11792 1727096167.10265: getting the remaining hosts for this loop 11792 1727096167.10266: done getting the remaining hosts for 
this loop 11792 1727096167.10272: getting the next task for host managed_node2 11792 1727096167.10284: done getting next task for host managed_node2 11792 1727096167.10288: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096167.10294: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096167.10318: getting variables 11792 1727096167.10320: in VariableManager get_vars() 11792 1727096167.10606: Calling all_inventory to load vars for managed_node2 11792 1727096167.10611: Calling groups_inventory to load vars for managed_node2 11792 1727096167.10615: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096167.10626: Calling all_plugins_play to load vars for managed_node2 11792 1727096167.10630: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096167.10641: Calling groups_plugins_play to load vars for managed_node2 11792 1727096167.12311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096167.14117: done with get_vars() 11792 1727096167.14148: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:07 -0400 (0:00:00.105) 0:00:49.421 ****** 11792 1727096167.14277: entering _queue_task() for managed_node2/stat 11792 1727096167.14665: worker is 1 (out of 1 available) 11792 1727096167.14681: exiting _queue_task() for managed_node2/stat 11792 1727096167.14694: done queuing things up, now waiting for results queue to drain 11792 1727096167.14696: waiting for pending results... 
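For orientation, the two guarded tasks traced here (set_facts.yml lines 3 and 12) presumably look something like the sketch below. It is reconstructed only from the task names, the modules visible in the trace (setup, stat) and the conditionals that appear verbatim above; it is not copied from the role source, and the module arguments and register name are assumptions.

    # set_facts.yml (sketch) - facts guard and ostree detection
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min        # assumption; only the conditional below is confirmed by the trace
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true                # consistent with the "censored" skip result shown above

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted  # assumption: conventional marker file for ostree systems
      register: __ostree_booted_stat    # hypothetical name
      when: not __network_is_ostree is defined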
11792 1727096167.15045: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096167.15179: in run() - task 0afff68d-5257-d9c7-3fc0-000000000b12 11792 1727096167.15190: variable 'ansible_search_path' from source: unknown 11792 1727096167.15194: variable 'ansible_search_path' from source: unknown 11792 1727096167.15222: calling self._execute() 11792 1727096167.15325: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096167.15328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096167.15332: variable 'omit' from source: magic vars 11792 1727096167.15711: variable 'ansible_distribution_major_version' from source: facts 11792 1727096167.15715: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096167.15900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096167.16124: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096167.16154: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096167.16199: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096167.16256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096167.16338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096167.16391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096167.16394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096167.16435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096167.16523: variable '__network_is_ostree' from source: set_fact 11792 1727096167.16529: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096167.16532: when evaluation is False, skipping this task 11792 1727096167.16534: _execute() done 11792 1727096167.16538: dumping result to json 11792 1727096167.16540: done dumping result, returning 11792 1727096167.16543: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-d9c7-3fc0-000000000b12] 11792 1727096167.16582: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b12 11792 1727096167.16655: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b12 11792 1727096167.16658: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096167.16726: no more pending results, returning what we have 11792 1727096167.16730: results queue empty 11792 1727096167.16730: checking for any_errors_fatal 11792 1727096167.16742: done checking for any_errors_fatal 11792 1727096167.16743: checking for 
max_fail_percentage 11792 1727096167.16744: done checking for max_fail_percentage 11792 1727096167.16745: checking to see if all hosts have failed and the running result is not ok 11792 1727096167.16746: done checking to see if all hosts have failed 11792 1727096167.16746: getting the remaining hosts for this loop 11792 1727096167.16748: done getting the remaining hosts for this loop 11792 1727096167.16751: getting the next task for host managed_node2 11792 1727096167.16759: done getting next task for host managed_node2 11792 1727096167.16763: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096167.16770: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096167.16793: getting variables 11792 1727096167.16794: in VariableManager get_vars() 11792 1727096167.16832: Calling all_inventory to load vars for managed_node2 11792 1727096167.16834: Calling groups_inventory to load vars for managed_node2 11792 1727096167.16836: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096167.16845: Calling all_plugins_play to load vars for managed_node2 11792 1727096167.16848: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096167.16850: Calling groups_plugins_play to load vars for managed_node2 11792 1727096167.18397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096167.20417: done with get_vars() 11792 1727096167.20452: done getting variables 11792 1727096167.20517: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:07 -0400 (0:00:00.062) 0:00:49.484 ****** 11792 1727096167.20555: entering _queue_task() for managed_node2/set_fact 11792 1727096167.20907: worker is 1 (out of 1 available) 11792 1727096167.20921: exiting _queue_task() for managed_node2/set_fact 11792 1727096167.20935: done queuing things up, now waiting for results queue to drain 11792 1727096167.20936: waiting for pending results... 11792 1727096167.21186: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096167.21279: in run() - task 0afff68d-5257-d9c7-3fc0-000000000b13 11792 1727096167.21291: variable 'ansible_search_path' from source: unknown 11792 1727096167.21298: variable 'ansible_search_path' from source: unknown 11792 1727096167.21329: calling self._execute() 11792 1727096167.21432: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096167.21436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096167.21439: variable 'omit' from source: magic vars 11792 1727096167.21780: variable 'ansible_distribution_major_version' from source: facts 11792 1727096167.21790: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096167.21907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096167.22281: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096167.22284: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096167.22287: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096167.22346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096167.22557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096167.22590: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096167.22635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096167.22666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096167.22818: variable '__network_is_ostree' from source: set_fact 11792 1727096167.22845: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096167.22854: when evaluation is False, skipping this task 11792 1727096167.22862: _execute() done 11792 1727096167.22872: dumping result to json 11792 1727096167.22964: done dumping result, returning 11792 1727096167.22983: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-d9c7-3fc0-000000000b13] 11792 1727096167.22995: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b13 11792 1727096167.23134: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b13 11792 1727096167.23137: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096167.23214: no more pending results, returning what we have 11792 1727096167.23218: results queue empty 11792 1727096167.23219: checking for any_errors_fatal 11792 1727096167.23225: done checking for any_errors_fatal 11792 1727096167.23225: checking for max_fail_percentage 11792 1727096167.23227: done checking for max_fail_percentage 11792 1727096167.23228: checking to see if all hosts have failed and the running result is not ok 11792 1727096167.23229: done checking to see if all hosts have failed 11792 1727096167.23229: getting the remaining hosts for this loop 11792 1727096167.23231: done getting the remaining hosts for this loop 11792 1727096167.23234: getting the next task for host managed_node2 11792 1727096167.23245: done getting next task for host managed_node2 11792 1727096167.23248: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096167.23256: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096167.23281: getting variables 11792 1727096167.23283: in VariableManager get_vars() 11792 1727096167.23325: Calling all_inventory to load vars for managed_node2 11792 1727096167.23328: Calling groups_inventory to load vars for managed_node2 11792 1727096167.23330: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096167.23338: Calling all_plugins_play to load vars for managed_node2 11792 1727096167.23341: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096167.23343: Calling groups_plugins_play to load vars for managed_node2 11792 1727096167.24466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096167.26256: done with get_vars() 11792 1727096167.26294: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:07 -0400 (0:00:00.058) 0:00:49.543 ****** 11792 1727096167.26418: entering _queue_task() for managed_node2/service_facts 11792 1727096167.26840: worker is 1 (out of 1 available) 11792 1727096167.26853: exiting _queue_task() for managed_node2/service_facts 11792 1727096167.26871: done queuing things up, now waiting for results queue to drain 11792 1727096167.26873: waiting for pending results... 
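Likewise, the set_fact task skipped just above (set_facts.yml line 17) and the service_facts task being queued next (line 21) plausibly read as follows. The trace confirms the modules and the conditional; the value assigned to the fact is an assumption. No register is sketched for service_facts because, as the module stdout later in the trace shows, its results come back under ansible_facts.services directly.

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # hypothetical source variable
      when: not __network_is_ostree is defined

    - name: Check which services are running
      service_facts:    # populates ansible_facts.services, as seen in the JSON result below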
11792 1727096167.27261: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096167.27460: in run() - task 0afff68d-5257-d9c7-3fc0-000000000b15 11792 1727096167.27464: variable 'ansible_search_path' from source: unknown 11792 1727096167.27472: variable 'ansible_search_path' from source: unknown 11792 1727096167.27479: calling self._execute() 11792 1727096167.27612: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096167.27616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096167.27618: variable 'omit' from source: magic vars 11792 1727096167.28024: variable 'ansible_distribution_major_version' from source: facts 11792 1727096167.28162: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096167.28166: variable 'omit' from source: magic vars 11792 1727096167.28626: variable 'omit' from source: magic vars 11792 1727096167.28780: variable 'omit' from source: magic vars 11792 1727096167.28784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096167.28822: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096167.28850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096167.28997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096167.29088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096167.29149: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096167.29193: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096167.29196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096167.29367: Set connection var ansible_timeout to 10 11792 1727096167.29458: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096167.29530: Set connection var ansible_shell_executable to /bin/sh 11792 1727096167.29534: Set connection var ansible_pipelining to False 11792 1727096167.29536: Set connection var ansible_shell_type to sh 11792 1727096167.29538: Set connection var ansible_connection to ssh 11792 1727096167.29632: variable 'ansible_shell_executable' from source: unknown 11792 1727096167.29635: variable 'ansible_connection' from source: unknown 11792 1727096167.29637: variable 'ansible_module_compression' from source: unknown 11792 1727096167.29638: variable 'ansible_shell_type' from source: unknown 11792 1727096167.29640: variable 'ansible_shell_executable' from source: unknown 11792 1727096167.29664: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096167.29668: variable 'ansible_pipelining' from source: unknown 11792 1727096167.29734: variable 'ansible_timeout' from source: unknown 11792 1727096167.29738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096167.29934: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096167.29944: variable 'omit' from source: magic vars 11792 
1727096167.29950: starting attempt loop 11792 1727096167.29952: running the handler 11792 1727096167.29981: _low_level_execute_command(): starting 11792 1727096167.29993: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096167.30629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.30691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096167.30696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096167.30699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096167.30750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096167.32430: stdout chunk (state=3): >>>/root <<< 11792 1727096167.32527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096167.32567: stderr chunk (state=3): >>><<< 11792 1727096167.32572: stdout chunk (state=3): >>><<< 11792 1727096167.32586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096167.32602: _low_level_execute_command(): starting 11792 1727096167.32610: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419 `" && echo ansible-tmp-1727096167.3258834-14130-25740858789419="` echo 
/root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419 `" ) && sleep 0' 11792 1727096167.33050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096167.33089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096167.33093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.33098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096167.33112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.33182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096167.33209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096167.33261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096167.35178: stdout chunk (state=3): >>>ansible-tmp-1727096167.3258834-14130-25740858789419=/root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419 <<< 11792 1727096167.35284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096167.35318: stderr chunk (state=3): >>><<< 11792 1727096167.35321: stdout chunk (state=3): >>><<< 11792 1727096167.35335: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096167.3258834-14130-25740858789419=/root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096167.35377: variable 'ansible_module_compression' from source: unknown 11792 1727096167.35419: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11792 1727096167.35451: variable 'ansible_facts' from source: unknown 11792 1727096167.35512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/AnsiballZ_service_facts.py 11792 1727096167.35627: Sending initial data 11792 1727096167.35630: Sent initial data (161 bytes) 11792 1727096167.36178: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096167.36183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096167.36187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.36255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096167.36262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096167.36322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096167.37926: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096167.37960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096167.38001: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpgv1tvay3 /root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/AnsiballZ_service_facts.py <<< 11792 1727096167.38005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/AnsiballZ_service_facts.py" <<< 11792 1727096167.38042: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpgv1tvay3" to remote "/root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/AnsiballZ_service_facts.py" <<< 11792 1727096167.38665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096167.38703: stderr chunk (state=3): >>><<< 11792 1727096167.38706: stdout chunk (state=3): >>><<< 11792 1727096167.38778: done transferring module to remote 11792 1727096167.38788: _low_level_execute_command(): starting 11792 1727096167.38793: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/ /root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/AnsiballZ_service_facts.py && sleep 0' 11792 1727096167.39424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096167.39429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.39432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096167.39435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096167.39437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.39439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096167.39450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096167.39507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096167.41360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096167.41393: stderr chunk (state=3): >>><<< 11792 1727096167.41396: stdout chunk (state=3): >>><<< 11792 1727096167.41445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096167.41449: _low_level_execute_command(): starting 11792 1727096167.41451: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/AnsiballZ_service_facts.py && sleep 0' 11792 1727096167.42112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096167.42116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.42118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096167.42120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096167.42122: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096167.42176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096167.42179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096167.42226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.12019: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": 
{"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11792 1727096169.13677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096169.13681: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. <<< 11792 1727096169.13683: stderr chunk (state=3): >>><<< 11792 1727096169.13685: stdout chunk (state=3): >>><<< 11792 1727096169.13689: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": 
"dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": 
"firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096169.14495: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096169.14518: _low_level_execute_command(): starting 11792 1727096169.14540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096167.3258834-14130-25740858789419/ > /dev/null 2>&1 && sleep 0' 11792 1727096169.15285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096169.15310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096169.15330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096169.15447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.15462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096169.15483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096169.15506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096169.15584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.17542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096169.17570: stdout chunk (state=3): >>><<< 11792 1727096169.17583: stderr chunk (state=3): >>><<< 11792 1727096169.17602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096169.17613: handler run complete 11792 1727096169.17979: variable 'ansible_facts' from source: unknown 11792 1727096169.17996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096169.18504: variable 'ansible_facts' from source: unknown 11792 1727096169.18644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096169.18847: attempt loop complete, returning result 11792 1727096169.18850: _execute() done 11792 1727096169.18864: dumping result to json 11792 1727096169.18920: done dumping result, returning 11792 1727096169.18929: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-d9c7-3fc0-000000000b15] 11792 1727096169.18933: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b15 11792 1727096169.19820: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b15 11792 1727096169.19825: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096169.19914: no more pending results, returning what we have 11792 1727096169.19917: results queue empty 11792 1727096169.19918: checking for any_errors_fatal 11792 1727096169.19923: done checking for any_errors_fatal 11792 1727096169.19924: checking for max_fail_percentage 11792 1727096169.19926: done checking for max_fail_percentage 11792 1727096169.19927: checking to see if all hosts have failed and the running result is not ok 11792 1727096169.19927: done checking to see if all hosts have failed 11792 1727096169.19928: getting the remaining hosts for this loop 11792 1727096169.19929: done getting the remaining hosts for this loop 11792 1727096169.19932: getting the next task for host managed_node2 11792 1727096169.19940: done getting next task for host managed_node2 11792 1727096169.19943: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096169.19949: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096169.19983: getting variables 11792 1727096169.19985: in VariableManager get_vars() 11792 1727096169.20033: Calling all_inventory to load vars for managed_node2 11792 1727096169.20036: Calling groups_inventory to load vars for managed_node2 11792 1727096169.20039: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096169.20048: Calling all_plugins_play to load vars for managed_node2 11792 1727096169.20051: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096169.20059: Calling groups_plugins_play to load vars for managed_node2 11792 1727096169.21675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096169.22556: done with get_vars() 11792 1727096169.22580: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:56:09 -0400 (0:00:01.962) 0:00:51.505 ****** 11792 1727096169.22655: entering _queue_task() for managed_node2/package_facts 11792 1727096169.22923: worker is 1 (out of 1 available) 11792 1727096169.22937: exiting _queue_task() for managed_node2/package_facts 11792 1727096169.22952: done queuing things up, now waiting for results queue to drain 11792 1727096169.22953: waiting for pending results... 
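
The two fact-gathering steps traced here, service_facts above and the package_facts task queued next, fill ansible_facts.services and ansible_facts.packages with the structure visible in the stdout dump (per-service name/state/status/source, per-package version lists). The short play below is not part of this run; it is a minimal, illustrative sketch with a placeholder host pattern, showing how facts of that shape are typically gathered and read back.

---
# Illustrative sketch only; "all" is a placeholder host pattern, not taken from this run's inventory.
- hosts: all
  gather_facts: false
  tasks:
    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Check which packages are installed
      ansible.builtin.package_facts:

    - name: Report NetworkManager state using the same keys seen in the stdout above
      ansible.builtin.debug:
        msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"
      when: "'NetworkManager.service' in ansible_facts.services"

    - name: Show installed NetworkManager package entries, if any
      ansible.builtin.debug:
        var: ansible_facts.packages['NetworkManager']
      when: "'NetworkManager' in ansible_facts.packages"
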
11792 1727096169.23143: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096169.23256: in run() - task 0afff68d-5257-d9c7-3fc0-000000000b16 11792 1727096169.23272: variable 'ansible_search_path' from source: unknown 11792 1727096169.23276: variable 'ansible_search_path' from source: unknown 11792 1727096169.23308: calling self._execute() 11792 1727096169.23401: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096169.23405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096169.23446: variable 'omit' from source: magic vars 11792 1727096169.23873: variable 'ansible_distribution_major_version' from source: facts 11792 1727096169.23877: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096169.23879: variable 'omit' from source: magic vars 11792 1727096169.23916: variable 'omit' from source: magic vars 11792 1727096169.23998: variable 'omit' from source: magic vars 11792 1727096169.24002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096169.24021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096169.24041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096169.24062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096169.24107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096169.24115: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096169.24121: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096169.24123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096169.24212: Set connection var ansible_timeout to 10 11792 1727096169.24216: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096169.24226: Set connection var ansible_shell_executable to /bin/sh 11792 1727096169.24229: Set connection var ansible_pipelining to False 11792 1727096169.24232: Set connection var ansible_shell_type to sh 11792 1727096169.24234: Set connection var ansible_connection to ssh 11792 1727096169.24253: variable 'ansible_shell_executable' from source: unknown 11792 1727096169.24258: variable 'ansible_connection' from source: unknown 11792 1727096169.24261: variable 'ansible_module_compression' from source: unknown 11792 1727096169.24264: variable 'ansible_shell_type' from source: unknown 11792 1727096169.24266: variable 'ansible_shell_executable' from source: unknown 11792 1727096169.24399: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096169.24403: variable 'ansible_pipelining' from source: unknown 11792 1727096169.24405: variable 'ansible_timeout' from source: unknown 11792 1727096169.24408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096169.24477: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096169.24510: variable 'omit' from source: magic vars 11792 
1727096169.24513: starting attempt loop 11792 1727096169.24516: running the handler 11792 1727096169.24642: _low_level_execute_command(): starting 11792 1727096169.24645: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096169.25098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096169.25105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.25109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.25153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096169.25157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096169.25159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096169.25204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.26933: stdout chunk (state=3): >>>/root <<< 11792 1727096169.27097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096169.27101: stdout chunk (state=3): >>><<< 11792 1727096169.27103: stderr chunk (state=3): >>><<< 11792 1727096169.27124: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096169.27149: _low_level_execute_command(): starting 11792 1727096169.27249: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666 `" && echo ansible-tmp-1727096169.2713332-14212-32795009295666="` echo /root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666 `" ) && sleep 0' 11792 1727096169.27828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096169.27844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096169.27863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096169.27890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096169.27908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096169.28045: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.28048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096169.28070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096169.28096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096169.28165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.30163: stdout chunk (state=3): >>>ansible-tmp-1727096169.2713332-14212-32795009295666=/root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666 <<< 11792 1727096169.30270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096169.30444: stderr chunk (state=3): >>><<< 11792 1727096169.30448: stdout chunk (state=3): >>><<< 11792 1727096169.30452: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096169.2713332-14212-32795009295666=/root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096169.30457: variable 'ansible_module_compression' from source: unknown 11792 1727096169.30459: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11792 1727096169.30511: variable 'ansible_facts' from source: unknown 11792 1727096169.30694: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/AnsiballZ_package_facts.py 11792 1727096169.30898: Sending initial data 11792 1727096169.30902: Sent initial data (161 bytes) 11792 1727096169.31534: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.31591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096169.31594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096169.31607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096169.31655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.33269: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096169.33299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096169.33352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpdrwzqqke /root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/AnsiballZ_package_facts.py <<< 11792 1727096169.33356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/AnsiballZ_package_facts.py" <<< 11792 1727096169.33430: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpdrwzqqke" to remote "/root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/AnsiballZ_package_facts.py" <<< 11792 1727096169.34851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096169.34895: stderr chunk (state=3): >>><<< 11792 1727096169.34898: stdout chunk (state=3): >>><<< 11792 1727096169.34914: done transferring module to remote 11792 1727096169.34923: _low_level_execute_command(): starting 11792 1727096169.34928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/ /root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/AnsiballZ_package_facts.py && sleep 0' 11792 1727096169.35775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096169.35780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.35783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.35829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096169.35842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096169.35864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096169.35939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.37878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096169.37884: stdout chunk (state=3): >>><<< 11792 1727096169.37887: stderr chunk (state=3): >>><<< 11792 1727096169.37995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096169.38000: _low_level_execute_command(): starting 11792 1727096169.38003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/AnsiballZ_package_facts.py && sleep 0' 11792 1727096169.38722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096169.38752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096169.38789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096169.38847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.84363: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": 
"10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11792 1727096169.84383: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11792 1727096169.84415: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": 
"lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11792 1727096169.84439: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": 
"0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11792 1727096169.84455: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": 
[{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 11792 1727096169.84469: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", 
"version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 11792 1727096169.84499: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 11792 1727096169.84508: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 11792 1727096169.84514: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11792 1727096169.84539: stdout chunk (state=3): 
>>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11792 1727096169.84555: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 11792 1727096169.84565: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11792 1727096169.86371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096169.86396: stderr chunk (state=3): >>><<< 11792 1727096169.86399: stdout chunk (state=3): >>><<< 11792 1727096169.86436: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096169.93261: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096169.93272: _low_level_execute_command(): starting 11792 1727096169.93275: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096169.2713332-14212-32795009295666/ > /dev/null 2>&1 && sleep 0' 11792 1727096169.94153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096169.94165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.94171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096169.94190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096169.94193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096169.94244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096169.94269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096169.94293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096169.96242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096169.96374: stderr chunk (state=3): >>><<< 11792 1727096169.96378: stdout chunk (state=3): >>><<< 11792 1727096169.96380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096169.96383: handler run complete 11792 1727096169.99229: variable 'ansible_facts' from source: unknown 11792 1727096170.00873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.05442: variable 'ansible_facts' from source: unknown 11792 1727096170.06303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.07283: attempt loop complete, returning result 11792 1727096170.07305: _execute() done 11792 1727096170.07312: dumping result to json 11792 1727096170.07539: done dumping result, returning 11792 1727096170.07553: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-d9c7-3fc0-000000000b16] 11792 1727096170.07563: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b16 11792 1727096170.21542: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000b16 11792 1727096170.21545: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096170.21661: no more pending results, returning what we have 11792 1727096170.21664: results queue empty 11792 1727096170.21665: checking for any_errors_fatal 11792 1727096170.21670: done checking for any_errors_fatal 11792 1727096170.21671: checking for max_fail_percentage 11792 1727096170.21672: done checking for max_fail_percentage 11792 1727096170.21673: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.21674: done checking to see if all hosts have failed 11792 1727096170.21675: getting the remaining hosts for this loop 11792 1727096170.21676: done getting the remaining hosts for this loop 11792 1727096170.21679: getting the next task for host managed_node2 11792 1727096170.21685: done getting next task for host managed_node2 11792 1727096170.21688: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096170.21695: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096170.21706: getting variables 11792 1727096170.21707: in VariableManager get_vars() 11792 1727096170.21730: Calling all_inventory to load vars for managed_node2 11792 1727096170.21732: Calling groups_inventory to load vars for managed_node2 11792 1727096170.21734: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.21745: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.21747: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.21750: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.23099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.25156: done with get_vars() 11792 1727096170.25187: done getting variables 11792 1727096170.25240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:10 -0400 (0:00:01.026) 0:00:52.531 ****** 11792 1727096170.25277: entering _queue_task() for managed_node2/debug 11792 1727096170.25981: worker is 1 (out of 1 available) 11792 1727096170.25988: exiting _queue_task() for managed_node2/debug 11792 1727096170.26000: done queuing things up, now waiting for results queue to drain 11792 1727096170.26002: waiting for pending results... 
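[Annotation] The censored result above comes from Ansible's built-in package_facts module, which the role's "Check which packages are installed" task runs with the module_args visible in the invocation block ({"manager": ["auto"], "strategy": "first"}) and with no_log enabled. A minimal sketch of an equivalent task follows; the second task and its message are purely illustrative examples of consuming ansible_facts.packages, not taken from the role.

    # Sketch of a package_facts call matching the module_args seen in this log.
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto        # let Ansible pick the package manager (rpm here)
        strategy: first      # stop at the first manager that returns data
      no_log: true           # matches the censored result shown above

    # Hypothetical follow-up showing how the gathered facts could be read.
    - name: Example use of the gathered facts (hypothetical)
      ansible.builtin.debug:
        msg: "NetworkManager version: {{ ansible_facts.packages['NetworkManager'][0].version }}"
      when: "'NetworkManager' in ansible_facts.packages"
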
11792 1727096170.26131: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096170.26242: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a2f 11792 1727096170.26272: variable 'ansible_search_path' from source: unknown 11792 1727096170.26281: variable 'ansible_search_path' from source: unknown 11792 1727096170.26324: calling self._execute() 11792 1727096170.26446: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.26449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.26458: variable 'omit' from source: magic vars 11792 1727096170.26883: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.26887: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096170.26890: variable 'omit' from source: magic vars 11792 1727096170.26980: variable 'omit' from source: magic vars 11792 1727096170.27101: variable 'network_provider' from source: set_fact 11792 1727096170.27210: variable 'omit' from source: magic vars 11792 1727096170.27214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096170.27218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096170.27244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096170.27269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096170.27286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096170.27333: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096170.27343: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.27352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.27510: Set connection var ansible_timeout to 10 11792 1727096170.27572: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096170.27758: Set connection var ansible_shell_executable to /bin/sh 11792 1727096170.27761: Set connection var ansible_pipelining to False 11792 1727096170.27764: Set connection var ansible_shell_type to sh 11792 1727096170.27767: Set connection var ansible_connection to ssh 11792 1727096170.27771: variable 'ansible_shell_executable' from source: unknown 11792 1727096170.27773: variable 'ansible_connection' from source: unknown 11792 1727096170.27775: variable 'ansible_module_compression' from source: unknown 11792 1727096170.27780: variable 'ansible_shell_type' from source: unknown 11792 1727096170.27782: variable 'ansible_shell_executable' from source: unknown 11792 1727096170.27784: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.27786: variable 'ansible_pipelining' from source: unknown 11792 1727096170.27788: variable 'ansible_timeout' from source: unknown 11792 1727096170.27790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.28234: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11792 1727096170.28238: variable 'omit' from source: magic vars 11792 1727096170.28241: starting attempt loop 11792 1727096170.28243: running the handler 11792 1727096170.28246: handler run complete 11792 1727096170.28342: attempt loop complete, returning result 11792 1727096170.28345: _execute() done 11792 1727096170.28348: dumping result to json 11792 1727096170.28350: done dumping result, returning 11792 1727096170.28353: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-d9c7-3fc0-000000000a2f] 11792 1727096170.28356: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a2f 11792 1727096170.28693: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a2f 11792 1727096170.28696: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 11792 1727096170.28803: no more pending results, returning what we have 11792 1727096170.28807: results queue empty 11792 1727096170.28807: checking for any_errors_fatal 11792 1727096170.28820: done checking for any_errors_fatal 11792 1727096170.28821: checking for max_fail_percentage 11792 1727096170.28823: done checking for max_fail_percentage 11792 1727096170.28824: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.28825: done checking to see if all hosts have failed 11792 1727096170.28825: getting the remaining hosts for this loop 11792 1727096170.28827: done getting the remaining hosts for this loop 11792 1727096170.28830: getting the next task for host managed_node2 11792 1727096170.28838: done getting next task for host managed_node2 11792 1727096170.28959: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096170.28965: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096170.28980: getting variables 11792 1727096170.28982: in VariableManager get_vars() 11792 1727096170.29027: Calling all_inventory to load vars for managed_node2 11792 1727096170.29030: Calling groups_inventory to load vars for managed_node2 11792 1727096170.29033: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.29043: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.29046: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.29048: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.32170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.35376: done with get_vars() 11792 1727096170.35415: done getting variables 11792 1727096170.35478: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:10 -0400 (0:00:00.102) 0:00:52.635 ****** 11792 1727096170.35641: entering _queue_task() for managed_node2/fail 11792 1727096170.36314: worker is 1 (out of 1 available) 11792 1727096170.36326: exiting _queue_task() for managed_node2/fail 11792 1727096170.36337: done queuing things up, now waiting for results queue to drain 11792 1727096170.36338: waiting for pending results... 
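[Annotation] The "Print network provider" task traced above is a plain debug of the role's network_provider variable, which the trace shows was set earlier via set_fact; its result appears below as "MSG: Using network provider: nm". A sketch of such a task, assuming the message format matches the logged output (the role's actual task body may differ):

    # Sketch of the debug task whose result is logged as
    # "Using network provider: nm"; exact wording in the role may differ.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
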
11792 1727096170.36729: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096170.37030: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a30 11792 1727096170.37047: variable 'ansible_search_path' from source: unknown 11792 1727096170.37051: variable 'ansible_search_path' from source: unknown 11792 1727096170.37158: calling self._execute() 11792 1727096170.37387: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.37391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.37402: variable 'omit' from source: magic vars 11792 1727096170.37973: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.38189: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096170.38318: variable 'network_state' from source: role '' defaults 11792 1727096170.38353: Evaluated conditional (network_state != {}): False 11792 1727096170.38357: when evaluation is False, skipping this task 11792 1727096170.38359: _execute() done 11792 1727096170.38361: dumping result to json 11792 1727096170.38364: done dumping result, returning 11792 1727096170.38460: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-d9c7-3fc0-000000000a30] 11792 1727096170.38465: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a30 11792 1727096170.38654: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a30 11792 1727096170.38657: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096170.38729: no more pending results, returning what we have 11792 1727096170.38732: results queue empty 11792 1727096170.38733: checking for any_errors_fatal 11792 1727096170.38741: done checking for any_errors_fatal 11792 1727096170.38742: checking for max_fail_percentage 11792 1727096170.38744: done checking for max_fail_percentage 11792 1727096170.38745: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.38745: done checking to see if all hosts have failed 11792 1727096170.38746: getting the remaining hosts for this loop 11792 1727096170.38747: done getting the remaining hosts for this loop 11792 1727096170.38751: getting the next task for host managed_node2 11792 1727096170.38759: done getting next task for host managed_node2 11792 1727096170.38763: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096170.38770: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096170.38893: getting variables 11792 1727096170.38895: in VariableManager get_vars() 11792 1727096170.38938: Calling all_inventory to load vars for managed_node2 11792 1727096170.38941: Calling groups_inventory to load vars for managed_node2 11792 1727096170.38943: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.38954: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.38957: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.38960: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.41005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.42699: done with get_vars() 11792 1727096170.42738: done getting variables 11792 1727096170.42863: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:10 -0400 (0:00:00.072) 0:00:52.708 ****** 11792 1727096170.42908: entering _queue_task() for managed_node2/fail 11792 1727096170.43666: worker is 1 (out of 1 available) 11792 1727096170.43983: exiting _queue_task() for managed_node2/fail 11792 1727096170.43996: done queuing things up, now waiting for results queue to drain 11792 1727096170.43998: waiting for pending results... 
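[Annotation] The two "Abort applying the network state configuration ..." tasks traced here are fail tasks guarded by a when condition; because network_state keeps its role default of an empty mapping, the trace records "Evaluated conditional (network_state != {}): False" and both tasks are skipped. A simplified sketch of that guard pattern, showing only the condition visible in the trace (the real tasks likely carry additional conditions, and the fail message here is illustrative):

    # Sketch of the guarded fail pattern seen in the trace; skipped because the
    # role default for network_state is {}, so the condition is False.
    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: "Applying network_state is not supported with the initscripts provider"  # illustrative text
      when: network_state != {}
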
11792 1727096170.45027: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096170.45160: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a31 11792 1727096170.45251: variable 'ansible_search_path' from source: unknown 11792 1727096170.45345: variable 'ansible_search_path' from source: unknown 11792 1727096170.45556: calling self._execute() 11792 1727096170.45758: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.45974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.45978: variable 'omit' from source: magic vars 11792 1727096170.47047: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.47146: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096170.47774: variable 'network_state' from source: role '' defaults 11792 1727096170.47780: Evaluated conditional (network_state != {}): False 11792 1727096170.47783: when evaluation is False, skipping this task 11792 1727096170.47786: _execute() done 11792 1727096170.47790: dumping result to json 11792 1727096170.47792: done dumping result, returning 11792 1727096170.47795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-d9c7-3fc0-000000000a31] 11792 1727096170.47799: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a31 11792 1727096170.47886: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a31 11792 1727096170.47890: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096170.47944: no more pending results, returning what we have 11792 1727096170.47948: results queue empty 11792 1727096170.47949: checking for any_errors_fatal 11792 1727096170.47962: done checking for any_errors_fatal 11792 1727096170.47963: checking for max_fail_percentage 11792 1727096170.47965: done checking for max_fail_percentage 11792 1727096170.47966: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.47966: done checking to see if all hosts have failed 11792 1727096170.47969: getting the remaining hosts for this loop 11792 1727096170.47971: done getting the remaining hosts for this loop 11792 1727096170.47975: getting the next task for host managed_node2 11792 1727096170.47983: done getting next task for host managed_node2 11792 1727096170.47987: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096170.47994: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096170.48017: getting variables 11792 1727096170.48019: in VariableManager get_vars() 11792 1727096170.48378: Calling all_inventory to load vars for managed_node2 11792 1727096170.48382: Calling groups_inventory to load vars for managed_node2 11792 1727096170.48385: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.48399: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.48402: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.48405: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.51284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.54780: done with get_vars() 11792 1727096170.54819: done getting variables 11792 1727096170.54891: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:10 -0400 (0:00:00.120) 0:00:52.828 ****** 11792 1727096170.54928: entering _queue_task() for managed_node2/fail 11792 1727096170.55713: worker is 1 (out of 1 available) 11792 1727096170.55726: exiting _queue_task() for managed_node2/fail 11792 1727096170.55737: done queuing things up, now waiting for results queue to drain 11792 1727096170.55739: waiting for pending results... 
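[Annotation] The teaming abort task queued above is gated on distribution facts; the trace that follows shows the conditionals "ansible_distribution_major_version | int > 9" and "ansible_distribution in __network_rh_distros" both evaluating to True on this EL10 host. A simplified sketch of such a version/distribution guard, using only the conditions visible in the trace (the role's actual task presumably also checks whether any team interfaces are being configured; the fail message is illustrative):

    # Sketch of an EL10-or-later guard matching the conditionals evaluated below.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: "Team interfaces are not supported on this system version"  # illustrative text
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
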
11792 1727096170.56280: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096170.56286: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a32 11792 1727096170.56292: variable 'ansible_search_path' from source: unknown 11792 1727096170.56296: variable 'ansible_search_path' from source: unknown 11792 1727096170.56376: calling self._execute() 11792 1727096170.56427: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.56433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.56443: variable 'omit' from source: magic vars 11792 1727096170.56837: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.56854: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096170.57041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096170.60363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096170.60370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096170.60374: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096170.60377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096170.60379: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096170.60576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.60580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.60583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.60586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.60589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.60663: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.60687: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11792 1727096170.60899: variable 'ansible_distribution' from source: facts 11792 1727096170.60902: variable '__network_rh_distros' from source: role '' defaults 11792 1727096170.60905: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11792 1727096170.61205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.61309: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.61341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.61386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.61400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.61467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.61479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.61507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.61549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.61569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.61609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.61632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.61663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.61703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.61716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.62056: variable 'network_connections' from source: task vars 11792 1727096170.62071: variable 'controller_profile' from source: play vars 11792 1727096170.62144: variable 'controller_profile' from source: play vars 11792 1727096170.62154: variable 'controller_device' from source: play vars 11792 1727096170.62222: variable 'controller_device' from source: play vars 11792 1727096170.62230: variable 'dhcp_interface1' from 
source: play vars 11792 1727096170.62298: variable 'dhcp_interface1' from source: play vars 11792 1727096170.62308: variable 'port1_profile' from source: play vars 11792 1727096170.62380: variable 'port1_profile' from source: play vars 11792 1727096170.62383: variable 'dhcp_interface1' from source: play vars 11792 1727096170.62433: variable 'dhcp_interface1' from source: play vars 11792 1727096170.62439: variable 'controller_profile' from source: play vars 11792 1727096170.62499: variable 'controller_profile' from source: play vars 11792 1727096170.62506: variable 'port2_profile' from source: play vars 11792 1727096170.62638: variable 'port2_profile' from source: play vars 11792 1727096170.62641: variable 'dhcp_interface2' from source: play vars 11792 1727096170.62643: variable 'dhcp_interface2' from source: play vars 11792 1727096170.62654: variable 'controller_profile' from source: play vars 11792 1727096170.62700: variable 'controller_profile' from source: play vars 11792 1727096170.62709: variable 'network_state' from source: role '' defaults 11792 1727096170.62782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096170.62961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096170.63001: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096170.63032: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096170.63064: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096170.63111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096170.63133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096170.63159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.63198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096170.63312: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11792 1727096170.63316: when evaluation is False, skipping this task 11792 1727096170.63318: _execute() done 11792 1727096170.63320: dumping result to json 11792 1727096170.63322: done dumping result, returning 11792 1727096170.63392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-d9c7-3fc0-000000000a32] 11792 1727096170.63395: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a32 11792 1727096170.63465: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a32 11792 1727096170.63541: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11792 1727096170.63600: no more pending results, returning what we have 11792 1727096170.63604: results queue empty 11792 1727096170.63605: checking for any_errors_fatal 11792 1727096170.63613: done checking for any_errors_fatal 11792 1727096170.63614: checking for max_fail_percentage 11792 1727096170.63616: done checking for max_fail_percentage 11792 1727096170.63617: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.63618: done checking to see if all hosts have failed 11792 1727096170.63619: getting the remaining hosts for this loop 11792 1727096170.63620: done getting the remaining hosts for this loop 11792 1727096170.63624: getting the next task for host managed_node2 11792 1727096170.63633: done getting next task for host managed_node2 11792 1727096170.63637: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096170.63642: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096170.63666: getting variables 11792 1727096170.63771: in VariableManager get_vars() 11792 1727096170.63825: Calling all_inventory to load vars for managed_node2 11792 1727096170.63830: Calling groups_inventory to load vars for managed_node2 11792 1727096170.63832: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.63843: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.63846: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.63850: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.65777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.67471: done with get_vars() 11792 1727096170.67505: done getting variables 11792 1727096170.67588: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:56:10 -0400 (0:00:00.126) 0:00:52.955 ****** 11792 1727096170.67628: entering _queue_task() for managed_node2/dnf 11792 1727096170.68013: worker is 1 (out of 1 available) 11792 1727096170.68027: exiting _queue_task() for managed_node2/dnf 11792 1727096170.68040: done queuing things up, now waiting for results queue to drain 11792 1727096170.68042: waiting for pending results... 
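The banner above queues the DNF update check (roles/network/tasks/main.yml:36) through the 'dnf' action module; the entries below skip it because neither wireless nor team connections are defined. A rough sketch under those assumptions, with the module arguments assumed rather than taken from the role source:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed argument; only the action module and conditions appear in the log
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined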
11792 1727096170.68370: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096170.68575: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a33 11792 1727096170.68579: variable 'ansible_search_path' from source: unknown 11792 1727096170.68582: variable 'ansible_search_path' from source: unknown 11792 1727096170.68585: calling self._execute() 11792 1727096170.68680: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.68684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.68694: variable 'omit' from source: magic vars 11792 1727096170.69200: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.69219: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096170.69526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096170.72323: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096170.72390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096170.72437: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096170.72472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096170.72503: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096170.72588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.72639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.72773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.72777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.72780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.72854: variable 'ansible_distribution' from source: facts 11792 1727096170.72861: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.72879: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11792 1727096170.73002: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096170.73139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.73173: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.73197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.73233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.73247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.73296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.73317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.73340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.73388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.73402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.73439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.73572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.73576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.73578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.73581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.73712: variable 'network_connections' from source: task vars 11792 1727096170.73725: variable 'controller_profile' from source: play vars 11792 1727096170.73788: variable 'controller_profile' from source: play vars 11792 1727096170.73803: variable 'controller_device' from source: play vars 11792 1727096170.73865: variable 'controller_device' from source: play vars 11792 1727096170.73874: variable 'dhcp_interface1' from 
source: play vars 11792 1727096170.73940: variable 'dhcp_interface1' from source: play vars 11792 1727096170.73949: variable 'port1_profile' from source: play vars 11792 1727096170.74011: variable 'port1_profile' from source: play vars 11792 1727096170.74025: variable 'dhcp_interface1' from source: play vars 11792 1727096170.74086: variable 'dhcp_interface1' from source: play vars 11792 1727096170.74092: variable 'controller_profile' from source: play vars 11792 1727096170.74154: variable 'controller_profile' from source: play vars 11792 1727096170.74165: variable 'port2_profile' from source: play vars 11792 1727096170.74221: variable 'port2_profile' from source: play vars 11792 1727096170.74229: variable 'dhcp_interface2' from source: play vars 11792 1727096170.74296: variable 'dhcp_interface2' from source: play vars 11792 1727096170.74309: variable 'controller_profile' from source: play vars 11792 1727096170.74566: variable 'controller_profile' from source: play vars 11792 1727096170.74573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096170.74680: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096170.74725: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096170.74763: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096170.74805: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096170.74872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096170.74927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096170.74957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.74993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096170.75090: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096170.75364: variable 'network_connections' from source: task vars 11792 1727096170.75377: variable 'controller_profile' from source: play vars 11792 1727096170.75444: variable 'controller_profile' from source: play vars 11792 1727096170.75456: variable 'controller_device' from source: play vars 11792 1727096170.75520: variable 'controller_device' from source: play vars 11792 1727096170.75534: variable 'dhcp_interface1' from source: play vars 11792 1727096170.75729: variable 'dhcp_interface1' from source: play vars 11792 1727096170.75731: variable 'port1_profile' from source: play vars 11792 1727096170.75968: variable 'port1_profile' from source: play vars 11792 1727096170.75984: variable 'dhcp_interface1' from source: play vars 11792 1727096170.76040: variable 'dhcp_interface1' from source: play vars 11792 1727096170.76046: variable 'controller_profile' from source: play vars 11792 1727096170.76112: variable 'controller_profile' from source: play vars 
11792 1727096170.76119: variable 'port2_profile' from source: play vars 11792 1727096170.76179: variable 'port2_profile' from source: play vars 11792 1727096170.76185: variable 'dhcp_interface2' from source: play vars 11792 1727096170.76251: variable 'dhcp_interface2' from source: play vars 11792 1727096170.76261: variable 'controller_profile' from source: play vars 11792 1727096170.76372: variable 'controller_profile' from source: play vars 11792 1727096170.76375: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096170.76377: when evaluation is False, skipping this task 11792 1727096170.76380: _execute() done 11792 1727096170.76382: dumping result to json 11792 1727096170.76384: done dumping result, returning 11792 1727096170.76387: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000a33] 11792 1727096170.76388: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a33 11792 1727096170.76488: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a33 11792 1727096170.76491: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096170.76570: no more pending results, returning what we have 11792 1727096170.76574: results queue empty 11792 1727096170.76574: checking for any_errors_fatal 11792 1727096170.76583: done checking for any_errors_fatal 11792 1727096170.76583: checking for max_fail_percentage 11792 1727096170.76585: done checking for max_fail_percentage 11792 1727096170.76586: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.76587: done checking to see if all hosts have failed 11792 1727096170.76588: getting the remaining hosts for this loop 11792 1727096170.76589: done getting the remaining hosts for this loop 11792 1727096170.76593: getting the next task for host managed_node2 11792 1727096170.76602: done getting next task for host managed_node2 11792 1727096170.76605: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096170.76610: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096170.76634: getting variables 11792 1727096170.76636: in VariableManager get_vars() 11792 1727096170.76898: Calling all_inventory to load vars for managed_node2 11792 1727096170.76903: Calling groups_inventory to load vars for managed_node2 11792 1727096170.76905: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.76914: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.76917: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.76920: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.78364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.80244: done with get_vars() 11792 1727096170.80272: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096170.80343: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:56:10 -0400 (0:00:00.127) 0:00:53.083 ****** 11792 1727096170.80389: entering _queue_task() for managed_node2/yum 11792 1727096170.80760: worker is 1 (out of 1 available) 11792 1727096170.80976: exiting _queue_task() for managed_node2/yum 11792 1727096170.80989: done queuing things up, now waiting for results queue to drain 11792 1727096170.80990: waiting for pending results... 
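The YUM variant queued above (roles/network/tasks/main.yml:48) is redirected to the dnf action plugin and, as shown below, skipped because ansible_distribution_major_version | int < 8 is false on this host. A sketch of what such a task could look like; the module arguments and the second condition are inferred from the task name and were not evaluated in this run:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # assumed argument
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined  # inferred from the task name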
11792 1727096170.81288: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096170.81293: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a34 11792 1727096170.81297: variable 'ansible_search_path' from source: unknown 11792 1727096170.81299: variable 'ansible_search_path' from source: unknown 11792 1727096170.81317: calling self._execute() 11792 1727096170.81419: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.81425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.81440: variable 'omit' from source: magic vars 11792 1727096170.81840: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.81857: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096170.82041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096170.84510: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096170.84619: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096170.84683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096170.84723: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096170.84757: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096170.84905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.84983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.84992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.85178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.85180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.85435: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.85673: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11792 1727096170.85677: when evaluation is False, skipping this task 11792 1727096170.85679: _execute() done 11792 1727096170.85681: dumping result to json 11792 1727096170.85683: done dumping result, returning 11792 1727096170.85686: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000a34] 11792 
1727096170.85688: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a34 11792 1727096170.85761: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a34 11792 1727096170.85764: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11792 1727096170.85814: no more pending results, returning what we have 11792 1727096170.85817: results queue empty 11792 1727096170.85818: checking for any_errors_fatal 11792 1727096170.85823: done checking for any_errors_fatal 11792 1727096170.85823: checking for max_fail_percentage 11792 1727096170.85825: done checking for max_fail_percentage 11792 1727096170.85826: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.85826: done checking to see if all hosts have failed 11792 1727096170.85827: getting the remaining hosts for this loop 11792 1727096170.85829: done getting the remaining hosts for this loop 11792 1727096170.85832: getting the next task for host managed_node2 11792 1727096170.85839: done getting next task for host managed_node2 11792 1727096170.85843: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096170.85847: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096170.85870: getting variables 11792 1727096170.85872: in VariableManager get_vars() 11792 1727096170.85912: Calling all_inventory to load vars for managed_node2 11792 1727096170.85915: Calling groups_inventory to load vars for managed_node2 11792 1727096170.85917: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.85925: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.85927: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.85930: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.87420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.88943: done with get_vars() 11792 1727096170.88980: done getting variables 11792 1727096170.89045: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:56:10 -0400 (0:00:00.086) 0:00:53.170 ****** 11792 1727096170.89090: entering _queue_task() for managed_node2/fail 11792 1727096170.89481: worker is 1 (out of 1 available) 11792 1727096170.89495: exiting _queue_task() for managed_node2/fail 11792 1727096170.89508: done queuing things up, now waiting for results queue to drain 11792 1727096170.89509: waiting for pending results... 
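The consent prompt queued above (roles/network/tasks/main.yml:60) is another 'fail' guard; the entries below skip it for the same reason as the package-update checks. A minimal sketch based only on the condition logged in this run; the message wording is hypothetical:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: NetworkManager must be restarted to handle wireless or team interfaces  # hypothetical wording
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined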
11792 1727096170.89831: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096170.89981: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a35 11792 1727096170.90005: variable 'ansible_search_path' from source: unknown 11792 1727096170.90009: variable 'ansible_search_path' from source: unknown 11792 1727096170.90045: calling self._execute() 11792 1727096170.90157: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096170.90168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096170.90180: variable 'omit' from source: magic vars 11792 1727096170.90619: variable 'ansible_distribution_major_version' from source: facts 11792 1727096170.90631: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096170.90777: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096170.90997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096170.93343: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096170.93415: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096170.93723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096170.93727: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096170.93730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096170.93733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.93736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.93739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.93742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.93760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.93817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.93841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.93870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.93914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.93930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.93974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096170.93998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096170.94030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.94071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096170.94086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096170.94285: variable 'network_connections' from source: task vars 11792 1727096170.94299: variable 'controller_profile' from source: play vars 11792 1727096170.94383: variable 'controller_profile' from source: play vars 11792 1727096170.94391: variable 'controller_device' from source: play vars 11792 1727096170.94451: variable 'controller_device' from source: play vars 11792 1727096170.94471: variable 'dhcp_interface1' from source: play vars 11792 1727096170.94528: variable 'dhcp_interface1' from source: play vars 11792 1727096170.94538: variable 'port1_profile' from source: play vars 11792 1727096170.94607: variable 'port1_profile' from source: play vars 11792 1727096170.94614: variable 'dhcp_interface1' from source: play vars 11792 1727096170.94681: variable 'dhcp_interface1' from source: play vars 11792 1727096170.94687: variable 'controller_profile' from source: play vars 11792 1727096170.94743: variable 'controller_profile' from source: play vars 11792 1727096170.94750: variable 'port2_profile' from source: play vars 11792 1727096170.94817: variable 'port2_profile' from source: play vars 11792 1727096170.94821: variable 'dhcp_interface2' from source: play vars 11792 1727096170.94878: variable 'dhcp_interface2' from source: play vars 11792 1727096170.94886: variable 'controller_profile' from source: play vars 11792 1727096170.94943: variable 'controller_profile' from source: play vars 11792 1727096170.95024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096170.95213: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096170.95252: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096170.95361: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096170.95364: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096170.95367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096170.95390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096170.95423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096170.95450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096170.95525: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096170.96198: variable 'network_connections' from source: task vars 11792 1727096170.96202: variable 'controller_profile' from source: play vars 11792 1727096170.96264: variable 'controller_profile' from source: play vars 11792 1727096170.96336: variable 'controller_device' from source: play vars 11792 1727096170.96339: variable 'controller_device' from source: play vars 11792 1727096170.96342: variable 'dhcp_interface1' from source: play vars 11792 1727096170.96408: variable 'dhcp_interface1' from source: play vars 11792 1727096170.96415: variable 'port1_profile' from source: play vars 11792 1727096170.96477: variable 'port1_profile' from source: play vars 11792 1727096170.96484: variable 'dhcp_interface1' from source: play vars 11792 1727096170.96551: variable 'dhcp_interface1' from source: play vars 11792 1727096170.96554: variable 'controller_profile' from source: play vars 11792 1727096170.96621: variable 'controller_profile' from source: play vars 11792 1727096170.96628: variable 'port2_profile' from source: play vars 11792 1727096170.96763: variable 'port2_profile' from source: play vars 11792 1727096170.96773: variable 'dhcp_interface2' from source: play vars 11792 1727096170.96776: variable 'dhcp_interface2' from source: play vars 11792 1727096170.96778: variable 'controller_profile' from source: play vars 11792 1727096170.96853: variable 'controller_profile' from source: play vars 11792 1727096170.96881: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096170.96884: when evaluation is False, skipping this task 11792 1727096170.96886: _execute() done 11792 1727096170.96889: dumping result to json 11792 1727096170.96891: done dumping result, returning 11792 1727096170.96900: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000a35] 11792 1727096170.96903: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a35 11792 1727096170.97004: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a35 11792 1727096170.97008: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 11792 1727096170.97066: no more pending results, returning what we have 11792 1727096170.97071: results queue empty 11792 1727096170.97072: checking for any_errors_fatal 11792 1727096170.97079: done checking for any_errors_fatal 11792 1727096170.97080: checking for max_fail_percentage 11792 1727096170.97081: done checking for max_fail_percentage 11792 1727096170.97082: checking to see if all hosts have failed and the running result is not ok 11792 1727096170.97083: done checking to see if all hosts have failed 11792 1727096170.97083: getting the remaining hosts for this loop 11792 1727096170.97085: done getting the remaining hosts for this loop 11792 1727096170.97088: getting the next task for host managed_node2 11792 1727096170.97095: done getting next task for host managed_node2 11792 1727096170.97099: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11792 1727096170.97104: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096170.97122: getting variables 11792 1727096170.97124: in VariableManager get_vars() 11792 1727096170.97174: Calling all_inventory to load vars for managed_node2 11792 1727096170.97177: Calling groups_inventory to load vars for managed_node2 11792 1727096170.97179: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096170.97189: Calling all_plugins_play to load vars for managed_node2 11792 1727096170.97192: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096170.97194: Calling groups_plugins_play to load vars for managed_node2 11792 1727096170.98185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096170.99505: done with get_vars() 11792 1727096170.99543: done getting variables 11792 1727096170.99610: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:56:10 -0400 (0:00:00.105) 0:00:53.275 ****** 11792 1727096170.99663: entering _queue_task() for managed_node2/package 11792 1727096170.99947: worker is 1 (out of 1 available) 11792 1727096170.99959: exiting _queue_task() for managed_node2/package 11792 1727096170.99976: done queuing things up, now waiting for results queue to drain 11792 1727096170.99978: waiting for pending results... 11792 1727096171.00163: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11792 1727096171.00285: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a36 11792 1727096171.00298: variable 'ansible_search_path' from source: unknown 11792 1727096171.00301: variable 'ansible_search_path' from source: unknown 11792 1727096171.00333: calling self._execute() 11792 1727096171.00409: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.00413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.00424: variable 'omit' from source: magic vars 11792 1727096171.00710: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.00719: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096171.00862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096171.01059: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096171.01098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096171.01123: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096171.01179: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096171.01261: variable 'network_packages' from source: role '' defaults 11792 1727096171.01339: variable '__network_provider_setup' from source: role '' defaults 11792 1727096171.01348: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096171.01399: variable 
'__network_service_name_default_nm' from source: role '' defaults 11792 1727096171.01410: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096171.01451: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096171.01571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096171.03506: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096171.03549: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096171.03581: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096171.03608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096171.03628: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096171.03691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.03712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.03731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.03760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.03775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.03806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.03825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.03842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.03875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.03886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.04040: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096171.04119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.04135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.04156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.04185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.04195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.04255: variable 'ansible_python' from source: facts 11792 1727096171.04274: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096171.04331: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096171.04390: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096171.04475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.04493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.04511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.04536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.04546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.04584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.04604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.04622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.04649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.04721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.04852: variable 'network_connections' from source: task vars 11792 1727096171.04855: variable 'controller_profile' from source: play vars 11792 1727096171.05077: variable 'controller_profile' from source: play vars 11792 1727096171.05081: variable 'controller_device' from source: play vars 11792 1727096171.05084: variable 'controller_device' from source: play vars 11792 1727096171.05086: variable 'dhcp_interface1' from source: play vars 11792 1727096171.05145: variable 'dhcp_interface1' from source: play vars 11792 1727096171.05159: variable 'port1_profile' from source: play vars 11792 1727096171.05262: variable 'port1_profile' from source: play vars 11792 1727096171.05272: variable 'dhcp_interface1' from source: play vars 11792 1727096171.05371: variable 'dhcp_interface1' from source: play vars 11792 1727096171.05380: variable 'controller_profile' from source: play vars 11792 1727096171.05479: variable 'controller_profile' from source: play vars 11792 1727096171.05487: variable 'port2_profile' from source: play vars 11792 1727096171.05590: variable 'port2_profile' from source: play vars 11792 1727096171.05594: variable 'dhcp_interface2' from source: play vars 11792 1727096171.05703: variable 'dhcp_interface2' from source: play vars 11792 1727096171.05706: variable 'controller_profile' from source: play vars 11792 1727096171.05792: variable 'controller_profile' from source: play vars 11792 1727096171.05876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096171.05895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096171.05947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.05950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096171.06006: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096171.06278: variable 'network_connections' from source: task vars 11792 1727096171.06285: variable 'controller_profile' from source: play vars 11792 1727096171.06385: variable 'controller_profile' from source: play vars 11792 1727096171.06575: variable 'controller_device' from source: play vars 11792 1727096171.06580: variable 'controller_device' from source: play vars 11792 1727096171.06583: variable 'dhcp_interface1' from source: play vars 11792 1727096171.06585: variable 'dhcp_interface1' from source: play vars 11792 1727096171.06588: variable 'port1_profile' from source: play vars 11792 1727096171.06670: variable 'port1_profile' from source: play vars 11792 1727096171.06678: variable 'dhcp_interface1' from source: play vars 11792 1727096171.06782: variable 'dhcp_interface1' from source: play vars 11792 1727096171.06790: variable 'controller_profile' from source: play vars 11792 1727096171.06861: variable 'controller_profile' from source: play vars 11792 1727096171.06870: variable 'port2_profile' from source: play vars 
11792 1727096171.06945: variable 'port2_profile' from source: play vars 11792 1727096171.06953: variable 'dhcp_interface2' from source: play vars 11792 1727096171.07021: variable 'dhcp_interface2' from source: play vars 11792 1727096171.07036: variable 'controller_profile' from source: play vars 11792 1727096171.07101: variable 'controller_profile' from source: play vars 11792 1727096171.07144: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096171.07201: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096171.07399: variable 'network_connections' from source: task vars 11792 1727096171.07402: variable 'controller_profile' from source: play vars 11792 1727096171.07449: variable 'controller_profile' from source: play vars 11792 1727096171.07454: variable 'controller_device' from source: play vars 11792 1727096171.07505: variable 'controller_device' from source: play vars 11792 1727096171.07512: variable 'dhcp_interface1' from source: play vars 11792 1727096171.07559: variable 'dhcp_interface1' from source: play vars 11792 1727096171.07566: variable 'port1_profile' from source: play vars 11792 1727096171.07614: variable 'port1_profile' from source: play vars 11792 1727096171.07620: variable 'dhcp_interface1' from source: play vars 11792 1727096171.07665: variable 'dhcp_interface1' from source: play vars 11792 1727096171.07671: variable 'controller_profile' from source: play vars 11792 1727096171.07718: variable 'controller_profile' from source: play vars 11792 1727096171.07724: variable 'port2_profile' from source: play vars 11792 1727096171.07771: variable 'port2_profile' from source: play vars 11792 1727096171.07777: variable 'dhcp_interface2' from source: play vars 11792 1727096171.07823: variable 'dhcp_interface2' from source: play vars 11792 1727096171.07829: variable 'controller_profile' from source: play vars 11792 1727096171.07876: variable 'controller_profile' from source: play vars 11792 1727096171.07896: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096171.07951: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096171.08150: variable 'network_connections' from source: task vars 11792 1727096171.08153: variable 'controller_profile' from source: play vars 11792 1727096171.08201: variable 'controller_profile' from source: play vars 11792 1727096171.08206: variable 'controller_device' from source: play vars 11792 1727096171.08254: variable 'controller_device' from source: play vars 11792 1727096171.08263: variable 'dhcp_interface1' from source: play vars 11792 1727096171.08310: variable 'dhcp_interface1' from source: play vars 11792 1727096171.08317: variable 'port1_profile' from source: play vars 11792 1727096171.08365: variable 'port1_profile' from source: play vars 11792 1727096171.08373: variable 'dhcp_interface1' from source: play vars 11792 1727096171.08416: variable 'dhcp_interface1' from source: play vars 11792 1727096171.08421: variable 'controller_profile' from source: play vars 11792 1727096171.08471: variable 'controller_profile' from source: play vars 11792 1727096171.08477: variable 'port2_profile' from source: play vars 11792 1727096171.08520: variable 'port2_profile' from source: play vars 11792 1727096171.08526: variable 'dhcp_interface2' from source: play vars 11792 1727096171.08647: variable 'dhcp_interface2' from source: play vars 11792 1727096171.08650: variable 'controller_profile' from source: play vars 11792 
1727096171.08652: variable 'controller_profile' from source: play vars 11792 1727096171.08803: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096171.08806: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096171.08812: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096171.09073: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096171.09076: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096171.09540: variable 'network_connections' from source: task vars 11792 1727096171.09544: variable 'controller_profile' from source: play vars 11792 1727096171.09605: variable 'controller_profile' from source: play vars 11792 1727096171.09612: variable 'controller_device' from source: play vars 11792 1727096171.09671: variable 'controller_device' from source: play vars 11792 1727096171.09681: variable 'dhcp_interface1' from source: play vars 11792 1727096171.09740: variable 'dhcp_interface1' from source: play vars 11792 1727096171.09748: variable 'port1_profile' from source: play vars 11792 1727096171.09806: variable 'port1_profile' from source: play vars 11792 1727096171.09812: variable 'dhcp_interface1' from source: play vars 11792 1727096171.09855: variable 'dhcp_interface1' from source: play vars 11792 1727096171.09863: variable 'controller_profile' from source: play vars 11792 1727096171.09942: variable 'controller_profile' from source: play vars 11792 1727096171.09954: variable 'port2_profile' from source: play vars 11792 1727096171.09995: variable 'port2_profile' from source: play vars 11792 1727096171.10001: variable 'dhcp_interface2' from source: play vars 11792 1727096171.10047: variable 'dhcp_interface2' from source: play vars 11792 1727096171.10051: variable 'controller_profile' from source: play vars 11792 1727096171.10101: variable 'controller_profile' from source: play vars 11792 1727096171.10108: variable 'ansible_distribution' from source: facts 11792 1727096171.10111: variable '__network_rh_distros' from source: role '' defaults 11792 1727096171.10116: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.10136: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096171.10246: variable 'ansible_distribution' from source: facts 11792 1727096171.10249: variable '__network_rh_distros' from source: role '' defaults 11792 1727096171.10254: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.10269: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096171.10376: variable 'ansible_distribution' from source: facts 11792 1727096171.10380: variable '__network_rh_distros' from source: role '' defaults 11792 1727096171.10386: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.10414: variable 'network_provider' from source: set_fact 11792 1727096171.10425: variable 'ansible_facts' from source: unknown 11792 1727096171.10861: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11792 1727096171.10864: when evaluation is False, skipping this task 11792 1727096171.10867: _execute() done 11792 1727096171.10871: dumping result to json 11792 1727096171.10873: done dumping result, returning 11792 1727096171.10881: done running TaskExecutor() for 
managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-d9c7-3fc0-000000000a36] 11792 1727096171.10885: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a36 11792 1727096171.10976: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a36 11792 1727096171.10978: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11792 1727096171.11025: no more pending results, returning what we have 11792 1727096171.11030: results queue empty 11792 1727096171.11030: checking for any_errors_fatal 11792 1727096171.11036: done checking for any_errors_fatal 11792 1727096171.11037: checking for max_fail_percentage 11792 1727096171.11039: done checking for max_fail_percentage 11792 1727096171.11039: checking to see if all hosts have failed and the running result is not ok 11792 1727096171.11040: done checking to see if all hosts have failed 11792 1727096171.11040: getting the remaining hosts for this loop 11792 1727096171.11042: done getting the remaining hosts for this loop 11792 1727096171.11045: getting the next task for host managed_node2 11792 1727096171.11052: done getting next task for host managed_node2 11792 1727096171.11055: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096171.11060: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096171.11082: getting variables 11792 1727096171.11083: in VariableManager get_vars() 11792 1727096171.11125: Calling all_inventory to load vars for managed_node2 11792 1727096171.11128: Calling groups_inventory to load vars for managed_node2 11792 1727096171.11130: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096171.11139: Calling all_plugins_play to load vars for managed_node2 11792 1727096171.11142: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096171.11145: Calling groups_plugins_play to load vars for managed_node2 11792 1727096171.11967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096171.12847: done with get_vars() 11792 1727096171.12872: done getting variables 11792 1727096171.12920: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:56:11 -0400 (0:00:00.132) 0:00:53.408 ****** 11792 1727096171.12947: entering _queue_task() for managed_node2/package 11792 1727096171.13213: worker is 1 (out of 1 available) 11792 1727096171.13226: exiting _queue_task() for managed_node2/package 11792 1727096171.13239: done queuing things up, now waiting for results queue to drain 11792 1727096171.13241: waiting for pending results... 
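Annotation: the "Install packages" task above (tasks/main.yml:73) was skipped because its conditional, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False: every package listed in network_packages already appears in the package facts gathered earlier in this run. The sketch below is a hypothetical task of the same shape, assuming a prior package_facts run; the name and parameters are illustrative and not taken from the role's tasks file.

# Hypothetical sketch, not the role's actual task at main.yml:73.
- name: Install packages (illustrative sketch)
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Skip when everything in network_packages is already installed,
  # mirroring the false_condition reported in the skip result above.
  when: not network_packages is subset(ansible_facts.packages.keys())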
11792 1727096171.13429: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096171.13530: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a37 11792 1727096171.13543: variable 'ansible_search_path' from source: unknown 11792 1727096171.13546: variable 'ansible_search_path' from source: unknown 11792 1727096171.13585: calling self._execute() 11792 1727096171.13655: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.13660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.13665: variable 'omit' from source: magic vars 11792 1727096171.13945: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.13956: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096171.14039: variable 'network_state' from source: role '' defaults 11792 1727096171.14047: Evaluated conditional (network_state != {}): False 11792 1727096171.14050: when evaluation is False, skipping this task 11792 1727096171.14056: _execute() done 11792 1727096171.14058: dumping result to json 11792 1727096171.14061: done dumping result, returning 11792 1727096171.14066: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-000000000a37] 11792 1727096171.14072: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a37 11792 1727096171.14165: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a37 11792 1727096171.14170: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096171.14221: no more pending results, returning what we have 11792 1727096171.14225: results queue empty 11792 1727096171.14225: checking for any_errors_fatal 11792 1727096171.14232: done checking for any_errors_fatal 11792 1727096171.14233: checking for max_fail_percentage 11792 1727096171.14235: done checking for max_fail_percentage 11792 1727096171.14236: checking to see if all hosts have failed and the running result is not ok 11792 1727096171.14237: done checking to see if all hosts have failed 11792 1727096171.14237: getting the remaining hosts for this loop 11792 1727096171.14239: done getting the remaining hosts for this loop 11792 1727096171.14242: getting the next task for host managed_node2 11792 1727096171.14249: done getting next task for host managed_node2 11792 1727096171.14255: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096171.14261: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096171.14285: getting variables 11792 1727096171.14286: in VariableManager get_vars() 11792 1727096171.14324: Calling all_inventory to load vars for managed_node2 11792 1727096171.14326: Calling groups_inventory to load vars for managed_node2 11792 1727096171.14328: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096171.14337: Calling all_plugins_play to load vars for managed_node2 11792 1727096171.14340: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096171.14342: Calling groups_plugins_play to load vars for managed_node2 11792 1727096171.15275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096171.16142: done with get_vars() 11792 1727096171.16164: done getting variables 11792 1727096171.16211: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:56:11 -0400 (0:00:00.032) 0:00:53.441 ****** 11792 1727096171.16239: entering _queue_task() for managed_node2/package 11792 1727096171.16511: worker is 1 (out of 1 available) 11792 1727096171.16524: exiting _queue_task() for managed_node2/package 11792 1727096171.16537: done queuing things up, now waiting for results queue to drain 11792 1727096171.16538: waiting for pending results... 
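Annotation: both network_state-related install tasks (main.yml:85, just skipped, and main.yml:96, queued next) are guarded by network_state != {}. This run drives the role through network_connections, so network_state keeps its empty default and both tasks skip. The sketch below is a hedged example of how a caller could pass a non-empty network_state to make them run; the interface keys shown are only illustrative of an nmstate-style declaration and are not taken from this play.

# Hypothetical invocation sketch: a non-empty network_state would make the
# "network_state != {}" conditionals above evaluate to True.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth0
              type: ethernet
              state: up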
11792 1727096171.16725: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096171.16843: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a38 11792 1727096171.16858: variable 'ansible_search_path' from source: unknown 11792 1727096171.16861: variable 'ansible_search_path' from source: unknown 11792 1727096171.16892: calling self._execute() 11792 1727096171.16967: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.16973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.16982: variable 'omit' from source: magic vars 11792 1727096171.17259: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.17267: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096171.17350: variable 'network_state' from source: role '' defaults 11792 1727096171.17359: Evaluated conditional (network_state != {}): False 11792 1727096171.17362: when evaluation is False, skipping this task 11792 1727096171.17364: _execute() done 11792 1727096171.17369: dumping result to json 11792 1727096171.17372: done dumping result, returning 11792 1727096171.17380: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-000000000a38] 11792 1727096171.17385: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a38 11792 1727096171.17480: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a38 11792 1727096171.17483: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096171.17532: no more pending results, returning what we have 11792 1727096171.17535: results queue empty 11792 1727096171.17536: checking for any_errors_fatal 11792 1727096171.17545: done checking for any_errors_fatal 11792 1727096171.17545: checking for max_fail_percentage 11792 1727096171.17547: done checking for max_fail_percentage 11792 1727096171.17548: checking to see if all hosts have failed and the running result is not ok 11792 1727096171.17548: done checking to see if all hosts have failed 11792 1727096171.17549: getting the remaining hosts for this loop 11792 1727096171.17551: done getting the remaining hosts for this loop 11792 1727096171.17556: getting the next task for host managed_node2 11792 1727096171.17563: done getting next task for host managed_node2 11792 1727096171.17569: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096171.17574: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096171.17602: getting variables 11792 1727096171.17604: in VariableManager get_vars() 11792 1727096171.17641: Calling all_inventory to load vars for managed_node2 11792 1727096171.17643: Calling groups_inventory to load vars for managed_node2 11792 1727096171.17645: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096171.17656: Calling all_plugins_play to load vars for managed_node2 11792 1727096171.17659: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096171.17662: Calling groups_plugins_play to load vars for managed_node2 11792 1727096171.18447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096171.19339: done with get_vars() 11792 1727096171.19369: done getting variables 11792 1727096171.19415: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:56:11 -0400 (0:00:00.032) 0:00:53.473 ****** 11792 1727096171.19445: entering _queue_task() for managed_node2/service 11792 1727096171.19718: worker is 1 (out of 1 available) 11792 1727096171.19730: exiting _queue_task() for managed_node2/service 11792 1727096171.19743: done queuing things up, now waiting for results queue to drain 11792 1727096171.19745: waiting for pending results... 
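Annotation: the task queued here, "Restart NetworkManager due to wireless or team interfaces" (main.yml:109), uses the service action plugin and is guarded by __network_wireless_connections_defined or __network_team_connections_defined, as the skip result further down shows. A minimal sketch of a task with that shape follows; the unit name and restarted state are assumptions based on the task title, not the role's actual file.

# Hypothetical sketch of a guarded restart task of this shape.
- name: Restart NetworkManager due to wireless or team interfaces (sketch)
  ansible.builtin.service:
    name: NetworkManager      # assumed unit name, inferred from the task title
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined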
11792 1727096171.19940: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096171.20057: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a39 11792 1727096171.20078: variable 'ansible_search_path' from source: unknown 11792 1727096171.20083: variable 'ansible_search_path' from source: unknown 11792 1727096171.20111: calling self._execute() 11792 1727096171.20186: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.20190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.20203: variable 'omit' from source: magic vars 11792 1727096171.20480: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.20489: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096171.20577: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096171.20709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096171.22446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096171.22496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096171.22523: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096171.22547: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096171.22571: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096171.22632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.22653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.22674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.22703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.22714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.22748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.22769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.22786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11792 1727096171.22814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.22825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.22853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.22873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.22889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.22916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.22926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.23044: variable 'network_connections' from source: task vars 11792 1727096171.23055: variable 'controller_profile' from source: play vars 11792 1727096171.23109: variable 'controller_profile' from source: play vars 11792 1727096171.23118: variable 'controller_device' from source: play vars 11792 1727096171.23166: variable 'controller_device' from source: play vars 11792 1727096171.23174: variable 'dhcp_interface1' from source: play vars 11792 1727096171.23215: variable 'dhcp_interface1' from source: play vars 11792 1727096171.23223: variable 'port1_profile' from source: play vars 11792 1727096171.23271: variable 'port1_profile' from source: play vars 11792 1727096171.23277: variable 'dhcp_interface1' from source: play vars 11792 1727096171.23318: variable 'dhcp_interface1' from source: play vars 11792 1727096171.23323: variable 'controller_profile' from source: play vars 11792 1727096171.23371: variable 'controller_profile' from source: play vars 11792 1727096171.23376: variable 'port2_profile' from source: play vars 11792 1727096171.23429: variable 'port2_profile' from source: play vars 11792 1727096171.23435: variable 'dhcp_interface2' from source: play vars 11792 1727096171.23481: variable 'dhcp_interface2' from source: play vars 11792 1727096171.23488: variable 'controller_profile' from source: play vars 11792 1727096171.23529: variable 'controller_profile' from source: play vars 11792 1727096171.23585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096171.23694: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096171.23722: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096171.23743: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 
1727096171.23769: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096171.23801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096171.23818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096171.23836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.23860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096171.23914: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096171.24069: variable 'network_connections' from source: task vars 11792 1727096171.24073: variable 'controller_profile' from source: play vars 11792 1727096171.24116: variable 'controller_profile' from source: play vars 11792 1727096171.24121: variable 'controller_device' from source: play vars 11792 1727096171.24167: variable 'controller_device' from source: play vars 11792 1727096171.24175: variable 'dhcp_interface1' from source: play vars 11792 1727096171.24218: variable 'dhcp_interface1' from source: play vars 11792 1727096171.24221: variable 'port1_profile' from source: play vars 11792 1727096171.24269: variable 'port1_profile' from source: play vars 11792 1727096171.24274: variable 'dhcp_interface1' from source: play vars 11792 1727096171.24315: variable 'dhcp_interface1' from source: play vars 11792 1727096171.24322: variable 'controller_profile' from source: play vars 11792 1727096171.24368: variable 'controller_profile' from source: play vars 11792 1727096171.24374: variable 'port2_profile' from source: play vars 11792 1727096171.24415: variable 'port2_profile' from source: play vars 11792 1727096171.24421: variable 'dhcp_interface2' from source: play vars 11792 1727096171.24469: variable 'dhcp_interface2' from source: play vars 11792 1727096171.24547: variable 'controller_profile' from source: play vars 11792 1727096171.24551: variable 'controller_profile' from source: play vars 11792 1727096171.24553: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096171.24557: when evaluation is False, skipping this task 11792 1727096171.24559: _execute() done 11792 1727096171.24561: dumping result to json 11792 1727096171.24563: done dumping result, returning 11792 1727096171.24565: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000a39] 11792 1727096171.24566: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a39 11792 1727096171.24646: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a39 11792 1727096171.24650: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096171.24720: no more pending results, returning what we have 
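Annotation: the skip above reflects that this play's network_connections resolve to a controller profile plus two port profiles on the dhcp test interfaces, none of which are wireless or team connections. Purely for illustration, an entry like the following is the kind of input that would set __network_team_connections_defined and let the restart task run; the option names are a hedged sketch of the role's connection schema and are not taken from this play.

# Illustrative only: a team-type connection entry (not present in this run)
# would flip __network_team_connections_defined to True.
network_connections:
  - name: team0
    type: team
    interface_name: team0
    state: up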
11792 1727096171.24724: results queue empty 11792 1727096171.24725: checking for any_errors_fatal 11792 1727096171.24732: done checking for any_errors_fatal 11792 1727096171.24733: checking for max_fail_percentage 11792 1727096171.24735: done checking for max_fail_percentage 11792 1727096171.24735: checking to see if all hosts have failed and the running result is not ok 11792 1727096171.24736: done checking to see if all hosts have failed 11792 1727096171.24737: getting the remaining hosts for this loop 11792 1727096171.24738: done getting the remaining hosts for this loop 11792 1727096171.24742: getting the next task for host managed_node2 11792 1727096171.24750: done getting next task for host managed_node2 11792 1727096171.24754: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096171.24760: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096171.24781: getting variables 11792 1727096171.24783: in VariableManager get_vars() 11792 1727096171.24824: Calling all_inventory to load vars for managed_node2 11792 1727096171.24827: Calling groups_inventory to load vars for managed_node2 11792 1727096171.24829: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096171.24838: Calling all_plugins_play to load vars for managed_node2 11792 1727096171.24840: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096171.24842: Calling groups_plugins_play to load vars for managed_node2 11792 1727096171.25792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096171.26679: done with get_vars() 11792 1727096171.26699: done getting variables 11792 1727096171.26749: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:11 -0400 (0:00:00.073) 0:00:53.546 ****** 11792 1727096171.26777: entering _queue_task() for managed_node2/service 11792 1727096171.27190: worker is 1 (out of 1 available) 11792 1727096171.27320: exiting _queue_task() for managed_node2/service 11792 1727096171.27333: done queuing things up, now waiting for results queue to drain 11792 1727096171.27335: waiting for pending results... 11792 1727096171.27547: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096171.27798: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a3a 11792 1727096171.27803: variable 'ansible_search_path' from source: unknown 11792 1727096171.27805: variable 'ansible_search_path' from source: unknown 11792 1727096171.27830: calling self._execute() 11792 1727096171.27917: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.27921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.27930: variable 'omit' from source: magic vars 11792 1727096171.28212: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.28221: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096171.28332: variable 'network_provider' from source: set_fact 11792 1727096171.28336: variable 'network_state' from source: role '' defaults 11792 1727096171.28344: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11792 1727096171.28350: variable 'omit' from source: magic vars 11792 1727096171.28406: variable 'omit' from source: magic vars 11792 1727096171.28425: variable 'network_service_name' from source: role '' defaults 11792 1727096171.28471: variable 'network_service_name' from source: role '' defaults 11792 1727096171.28546: variable '__network_provider_setup' from source: role '' defaults 11792 1727096171.28549: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096171.28599: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096171.28607: variable '__network_packages_default_nm' from source: role '' defaults 
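Annotation: the "Enable and start NetworkManager" task (main.yml:122) is the first task in this block whose conditional passes: network_provider was set to "nm" via set_fact, so network_provider == "nm" or network_state != {} is True, and the role resolves network_service_name before handing the work to the service action plugin over SSH. Below is a hedged sketch of a task of this shape; the started/enabled values are assumptions, not the role's actual parameters.

# Hypothetical sketch of the enable-and-start step at main.yml:122.
- name: Enable and start NetworkManager (sketch)
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started     # assumed
    enabled: true      # assumed
  when: network_provider == "nm" or network_state != {}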
11792 1727096171.28656: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096171.28806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096171.30475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096171.30478: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096171.30515: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096171.30573: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096171.30603: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096171.30680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.30716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.30746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.30791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.30809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.30851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.30879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.30908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.30950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.30971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.31272: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096171.31306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.31333: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.31360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.31404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.31422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.31514: variable 'ansible_python' from source: facts 11792 1727096171.31535: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096171.31619: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096171.31701: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096171.31826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.31856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.31887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.31930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.31949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.32000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.32072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.32075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.32105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.32127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.32266: variable 'network_connections' from 
source: task vars 11792 1727096171.32282: variable 'controller_profile' from source: play vars 11792 1727096171.32472: variable 'controller_profile' from source: play vars 11792 1727096171.32475: variable 'controller_device' from source: play vars 11792 1727096171.32477: variable 'controller_device' from source: play vars 11792 1727096171.32479: variable 'dhcp_interface1' from source: play vars 11792 1727096171.32524: variable 'dhcp_interface1' from source: play vars 11792 1727096171.32542: variable 'port1_profile' from source: play vars 11792 1727096171.32617: variable 'port1_profile' from source: play vars 11792 1727096171.32633: variable 'dhcp_interface1' from source: play vars 11792 1727096171.32707: variable 'dhcp_interface1' from source: play vars 11792 1727096171.32721: variable 'controller_profile' from source: play vars 11792 1727096171.32794: variable 'controller_profile' from source: play vars 11792 1727096171.32809: variable 'port2_profile' from source: play vars 11792 1727096171.32881: variable 'port2_profile' from source: play vars 11792 1727096171.32896: variable 'dhcp_interface2' from source: play vars 11792 1727096171.32966: variable 'dhcp_interface2' from source: play vars 11792 1727096171.32987: variable 'controller_profile' from source: play vars 11792 1727096171.33058: variable 'controller_profile' from source: play vars 11792 1727096171.33164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096171.33355: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096171.33411: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096171.33476: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096171.33519: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096171.33587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096171.33620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096171.33873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.33876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096171.33878: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096171.34036: variable 'network_connections' from source: task vars 11792 1727096171.34047: variable 'controller_profile' from source: play vars 11792 1727096171.34126: variable 'controller_profile' from source: play vars 11792 1727096171.34142: variable 'controller_device' from source: play vars 11792 1727096171.34217: variable 'controller_device' from source: play vars 11792 1727096171.34238: variable 'dhcp_interface1' from source: play vars 11792 1727096171.34309: variable 'dhcp_interface1' from source: play vars 11792 1727096171.34326: variable 'port1_profile' from source: 
play vars 11792 1727096171.34398: variable 'port1_profile' from source: play vars 11792 1727096171.34412: variable 'dhcp_interface1' from source: play vars 11792 1727096171.34474: variable 'dhcp_interface1' from source: play vars 11792 1727096171.34487: variable 'controller_profile' from source: play vars 11792 1727096171.34549: variable 'controller_profile' from source: play vars 11792 1727096171.34564: variable 'port2_profile' from source: play vars 11792 1727096171.34630: variable 'port2_profile' from source: play vars 11792 1727096171.34645: variable 'dhcp_interface2' from source: play vars 11792 1727096171.34724: variable 'dhcp_interface2' from source: play vars 11792 1727096171.34740: variable 'controller_profile' from source: play vars 11792 1727096171.34816: variable 'controller_profile' from source: play vars 11792 1727096171.34875: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096171.34957: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096171.35252: variable 'network_connections' from source: task vars 11792 1727096171.35263: variable 'controller_profile' from source: play vars 11792 1727096171.35337: variable 'controller_profile' from source: play vars 11792 1727096171.35350: variable 'controller_device' from source: play vars 11792 1727096171.35424: variable 'controller_device' from source: play vars 11792 1727096171.35439: variable 'dhcp_interface1' from source: play vars 11792 1727096171.35524: variable 'dhcp_interface1' from source: play vars 11792 1727096171.35538: variable 'port1_profile' from source: play vars 11792 1727096171.35611: variable 'port1_profile' from source: play vars 11792 1727096171.35624: variable 'dhcp_interface1' from source: play vars 11792 1727096171.35696: variable 'dhcp_interface1' from source: play vars 11792 1727096171.35709: variable 'controller_profile' from source: play vars 11792 1727096171.35782: variable 'controller_profile' from source: play vars 11792 1727096171.35795: variable 'port2_profile' from source: play vars 11792 1727096171.35863: variable 'port2_profile' from source: play vars 11792 1727096171.35879: variable 'dhcp_interface2' from source: play vars 11792 1727096171.35947: variable 'dhcp_interface2' from source: play vars 11792 1727096171.35959: variable 'controller_profile' from source: play vars 11792 1727096171.36030: variable 'controller_profile' from source: play vars 11792 1727096171.36063: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096171.36273: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096171.36434: variable 'network_connections' from source: task vars 11792 1727096171.36442: variable 'controller_profile' from source: play vars 11792 1727096171.36511: variable 'controller_profile' from source: play vars 11792 1727096171.36523: variable 'controller_device' from source: play vars 11792 1727096171.36595: variable 'controller_device' from source: play vars 11792 1727096171.36606: variable 'dhcp_interface1' from source: play vars 11792 1727096171.36680: variable 'dhcp_interface1' from source: play vars 11792 1727096171.36692: variable 'port1_profile' from source: play vars 11792 1727096171.36759: variable 'port1_profile' from source: play vars 11792 1727096171.36774: variable 'dhcp_interface1' from source: play vars 11792 1727096171.36834: variable 'dhcp_interface1' from source: play vars 11792 1727096171.36845: variable 'controller_profile' from source: play 
vars 11792 1727096171.36913: variable 'controller_profile' from source: play vars 11792 1727096171.36930: variable 'port2_profile' from source: play vars 11792 1727096171.36999: variable 'port2_profile' from source: play vars 11792 1727096171.37011: variable 'dhcp_interface2' from source: play vars 11792 1727096171.37083: variable 'dhcp_interface2' from source: play vars 11792 1727096171.37095: variable 'controller_profile' from source: play vars 11792 1727096171.37162: variable 'controller_profile' from source: play vars 11792 1727096171.37233: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096171.37297: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096171.37309: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096171.37370: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096171.37584: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096171.38273: variable 'network_connections' from source: task vars 11792 1727096171.38276: variable 'controller_profile' from source: play vars 11792 1727096171.38278: variable 'controller_profile' from source: play vars 11792 1727096171.38280: variable 'controller_device' from source: play vars 11792 1727096171.38282: variable 'controller_device' from source: play vars 11792 1727096171.38284: variable 'dhcp_interface1' from source: play vars 11792 1727096171.38285: variable 'dhcp_interface1' from source: play vars 11792 1727096171.38287: variable 'port1_profile' from source: play vars 11792 1727096171.38327: variable 'port1_profile' from source: play vars 11792 1727096171.38339: variable 'dhcp_interface1' from source: play vars 11792 1727096171.38401: variable 'dhcp_interface1' from source: play vars 11792 1727096171.38412: variable 'controller_profile' from source: play vars 11792 1727096171.38475: variable 'controller_profile' from source: play vars 11792 1727096171.38487: variable 'port2_profile' from source: play vars 11792 1727096171.38546: variable 'port2_profile' from source: play vars 11792 1727096171.38558: variable 'dhcp_interface2' from source: play vars 11792 1727096171.38619: variable 'dhcp_interface2' from source: play vars 11792 1727096171.38631: variable 'controller_profile' from source: play vars 11792 1727096171.38693: variable 'controller_profile' from source: play vars 11792 1727096171.38706: variable 'ansible_distribution' from source: facts 11792 1727096171.38713: variable '__network_rh_distros' from source: role '' defaults 11792 1727096171.38722: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.38751: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096171.39127: variable 'ansible_distribution' from source: facts 11792 1727096171.39135: variable '__network_rh_distros' from source: role '' defaults 11792 1727096171.39144: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.39161: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096171.39331: variable 'ansible_distribution' from source: facts 11792 1727096171.39339: variable '__network_rh_distros' from source: role '' defaults 11792 1727096171.39348: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.39388: variable 'network_provider' from source: set_fact 11792 
1727096171.39414: variable 'omit' from source: magic vars 11792 1727096171.39447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096171.39481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096171.39503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096171.39524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096171.39538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096171.39572: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096171.39581: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.39588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.39686: Set connection var ansible_timeout to 10 11792 1727096171.39699: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096171.39712: Set connection var ansible_shell_executable to /bin/sh 11792 1727096171.39721: Set connection var ansible_pipelining to False 11792 1727096171.39727: Set connection var ansible_shell_type to sh 11792 1727096171.39732: Set connection var ansible_connection to ssh 11792 1727096171.39759: variable 'ansible_shell_executable' from source: unknown 11792 1727096171.39766: variable 'ansible_connection' from source: unknown 11792 1727096171.39775: variable 'ansible_module_compression' from source: unknown 11792 1727096171.39781: variable 'ansible_shell_type' from source: unknown 11792 1727096171.39787: variable 'ansible_shell_executable' from source: unknown 11792 1727096171.39793: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.39801: variable 'ansible_pipelining' from source: unknown 11792 1727096171.39808: variable 'ansible_timeout' from source: unknown 11792 1727096171.39815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.39924: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096171.40173: variable 'omit' from source: magic vars 11792 1727096171.40176: starting attempt loop 11792 1727096171.40178: running the handler 11792 1727096171.40179: variable 'ansible_facts' from source: unknown 11792 1727096171.40729: _low_level_execute_command(): starting 11792 1727096171.40744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096171.41457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096171.41480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096171.41501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.41610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096171.41635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096171.41705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096171.43400: stdout chunk (state=3): >>>/root <<< 11792 1727096171.43502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096171.43532: stderr chunk (state=3): >>><<< 11792 1727096171.43535: stdout chunk (state=3): >>><<< 11792 1727096171.43556: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096171.43571: _low_level_execute_command(): starting 11792 1727096171.43576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350 `" && echo ansible-tmp-1727096171.435588-14307-9911594161350="` echo /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350 `" ) && sleep 0' 11792 1727096171.44016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096171.44019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096171.44022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.44025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.44027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.44078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096171.44081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096171.44121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096171.46085: stdout chunk (state=3): >>>ansible-tmp-1727096171.435588-14307-9911594161350=/root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350 <<< 11792 1727096171.46220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096171.46223: stdout chunk (state=3): >>><<< 11792 1727096171.46230: stderr chunk (state=3): >>><<< 11792 1727096171.46245: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096171.435588-14307-9911594161350=/root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096171.46276: variable 'ansible_module_compression' from source: unknown 11792 1727096171.46319: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11792 1727096171.46369: variable 'ansible_facts' from source: unknown 11792 1727096171.46506: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/AnsiballZ_systemd.py 11792 1727096171.46613: Sending initial data 11792 1727096171.46616: Sent initial data (153 bytes) 11792 1727096171.47041: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.47080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096171.47083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 
originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.47085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.47087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.47136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096171.47139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096171.47142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096171.47183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096171.48796: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096171.48825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096171.48864: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpz2gtd7cs /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/AnsiballZ_systemd.py <<< 11792 1727096171.48875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/AnsiballZ_systemd.py" <<< 11792 1727096171.48895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpz2gtd7cs" to remote "/root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/AnsiballZ_systemd.py" <<< 11792 1727096171.48898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/AnsiballZ_systemd.py" <<< 11792 1727096171.50164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096171.50170: stderr chunk (state=3): >>><<< 11792 1727096171.50172: stdout chunk (state=3): >>><<< 11792 1727096171.50174: done transferring module to remote 11792 1727096171.50176: _low_level_execute_command(): starting 11792 1727096171.50178: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/ /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/AnsiballZ_systemd.py && sleep 0' 11792 1727096171.50759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096171.50763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.50779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096171.50783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.50836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.50839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.50888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096171.50957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096171.50993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096171.52917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096171.52922: stdout chunk (state=3): >>><<< 11792 1727096171.52924: stderr chunk (state=3): >>><<< 11792 1727096171.52944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096171.53005: _low_level_execute_command(): starting 11792 1727096171.53008: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/AnsiballZ_systemd.py && sleep 0' 11792 1727096171.54077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096171.54090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096171.54093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.54096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096171.54098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096171.54100: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096171.54103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096171.54105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.54107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096171.54109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096171.54111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096171.54155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096171.84089: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": 
"infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4464640", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296575488", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "706910000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 11792 1727096171.84109: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11792 1727096171.86066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096171.86074: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096171.86132: stderr chunk (state=3): >>><<< 11792 1727096171.86136: stdout chunk (state=3): >>><<< 11792 1727096171.86159: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4464640", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296575488", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "706910000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096171.86333: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096171.86351: _low_level_execute_command(): starting 11792 1727096171.86358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096171.435588-14307-9911594161350/ > /dev/null 2>&1 && sleep 0' 11792 1727096171.86971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096171.87003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096171.87017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.87030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096171.87043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096171.87056: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096171.87063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.87080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096171.87126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096171.87129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096171.87131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096171.87133: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096171.87135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096171.87143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096171.87177: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096171.87183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096171.87247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096171.87283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096171.87297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096171.87378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096171.89357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096171.89361: stdout chunk (state=3): >>><<< 11792 1727096171.89364: stderr chunk (state=3): >>><<< 11792 1727096171.89576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096171.89579: handler run complete 11792 1727096171.89582: attempt loop complete, returning result 11792 1727096171.89584: _execute() done 11792 1727096171.89586: dumping result to json 11792 1727096171.89587: done dumping result, returning 11792 1727096171.89589: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-d9c7-3fc0-000000000a3a] 11792 1727096171.89592: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3a ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096171.89877: no more pending results, returning what we have 11792 1727096171.89881: results queue empty 11792 1727096171.89882: checking for any_errors_fatal 11792 1727096171.89890: done checking for any_errors_fatal 11792 1727096171.89891: checking for max_fail_percentage 11792 1727096171.89893: done checking for max_fail_percentage 11792 1727096171.89894: checking to see if all hosts have failed and the running result is not ok 
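[Editor's sketch] The "Enable and start NetworkManager" result above is reported ok with its body censored because no_log: true was set, and the systemd module arguments are visible in the debug output (name=NetworkManager, state=started, enabled=true, scope=system). Below is a minimal, self-contained sketch of an equivalent standalone task reconstructed from those logged arguments; the task inside the fedora.linux_system_roles.network role itself may be written differently.

    - hosts: managed_node2
      tasks:
        - name: Enable and start NetworkManager (sketch reconstructed from logged module args)
          ansible.builtin.systemd:
            name: NetworkManager
            state: started
            enabled: true
          no_log: true   # produces the "censored" result shown in the log above

Run with -vvv, a task like this reproduces the same sequence recorded here: the module is packaged as AnsiballZ_systemd.py, transferred over SFTP into a remote ansible-tmp directory, executed with the remote python, and the temporary directory is then removed.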
11792 1727096171.89895: done checking to see if all hosts have failed 11792 1727096171.89895: getting the remaining hosts for this loop 11792 1727096171.89897: done getting the remaining hosts for this loop 11792 1727096171.89901: getting the next task for host managed_node2 11792 1727096171.89908: done getting next task for host managed_node2 11792 1727096171.89912: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 1727096171.89916: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096171.89935: getting variables 11792 1727096171.89936: in VariableManager get_vars() 11792 1727096171.90017: Calling all_inventory to load vars for managed_node2 11792 1727096171.90020: Calling groups_inventory to load vars for managed_node2 11792 1727096171.90023: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096171.90182: Calling all_plugins_play to load vars for managed_node2 11792 1727096171.90186: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096171.90193: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3a 11792 1727096171.90201: WORKER PROCESS EXITING 11792 1727096171.90205: Calling groups_plugins_play to load vars for managed_node2 11792 1727096171.92063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096171.93689: done with get_vars() 11792 1727096171.93718: done getting variables 11792 1727096171.93789: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:11 -0400 (0:00:00.670) 0:00:54.217 ****** 11792 1727096171.93826: entering _queue_task() for managed_node2/service 11792 1727096171.94290: worker is 1 (out of 1 available) 11792 1727096171.94306: exiting _queue_task() for managed_node2/service 11792 1727096171.94318: done queuing things up, now waiting for results queue to drain 11792 1727096171.94319: waiting for pending results... 
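[Editor's sketch] The "Enable and start wpa_supplicant" task announced above is skipped in the entries that follow: ansible_distribution_major_version != '6' and network_provider == "nm" both evaluate True, but __network_wpa_supplicant_required evaluates False (the role consults __network_wireless_connections_defined and __network_ieee802_1x_connections_defined first), so the handler never runs. The following is a hedged, self-contained sketch of a task gated the same way; the module choice, the placeholder vars, and the exact condition list are illustrative assumptions, not the role's actual source.

    - hosts: localhost
      gather_facts: true
      vars:
        network_provider: nm                      # assumed here; set via set_fact in the real run
        __network_wpa_supplicant_required: false  # assumed role default when no wireless/802.1x profiles exist
      tasks:
        - name: Enable and start wpa_supplicant (illustrative sketch)
          ansible.builtin.systemd:
            name: wpa_supplicant
            state: started
            enabled: true
          when:
            - ansible_distribution_major_version != '6'
            - network_provider == "nm"
            - __network_wpa_supplicant_required | bool   # False here, so the task reports "skipping"

With the vars as set above, this play evaluates the conditional to False and emits the same "when evaluation is False, skipping this task" / skipping: [...] result pattern that appears below for managed_node2.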
11792 1727096171.94543: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 1727096171.94708: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a3b 11792 1727096171.94731: variable 'ansible_search_path' from source: unknown 11792 1727096171.94741: variable 'ansible_search_path' from source: unknown 11792 1727096171.94790: calling self._execute() 11792 1727096171.94905: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096171.94917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096171.94930: variable 'omit' from source: magic vars 11792 1727096171.95337: variable 'ansible_distribution_major_version' from source: facts 11792 1727096171.95355: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096171.95486: variable 'network_provider' from source: set_fact 11792 1727096171.95502: Evaluated conditional (network_provider == "nm"): True 11792 1727096171.95634: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096171.95705: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096171.95893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096171.98199: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096171.98328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096171.98332: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096171.98373: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096171.98403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096171.98507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.98548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.98583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.98657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.98660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.98706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.98734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11792 1727096171.98876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.98879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.98882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.98884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096171.98911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096171.98939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096171.98994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096171.99012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096171.99173: variable 'network_connections' from source: task vars 11792 1727096171.99194: variable 'controller_profile' from source: play vars 11792 1727096171.99276: variable 'controller_profile' from source: play vars 11792 1727096171.99311: variable 'controller_device' from source: play vars 11792 1727096171.99363: variable 'controller_device' from source: play vars 11792 1727096171.99379: variable 'dhcp_interface1' from source: play vars 11792 1727096171.99472: variable 'dhcp_interface1' from source: play vars 11792 1727096171.99475: variable 'port1_profile' from source: play vars 11792 1727096171.99531: variable 'port1_profile' from source: play vars 11792 1727096171.99543: variable 'dhcp_interface1' from source: play vars 11792 1727096171.99608: variable 'dhcp_interface1' from source: play vars 11792 1727096171.99638: variable 'controller_profile' from source: play vars 11792 1727096171.99690: variable 'controller_profile' from source: play vars 11792 1727096171.99702: variable 'port2_profile' from source: play vars 11792 1727096171.99856: variable 'port2_profile' from source: play vars 11792 1727096171.99860: variable 'dhcp_interface2' from source: play vars 11792 1727096171.99862: variable 'dhcp_interface2' from source: play vars 11792 1727096171.99864: variable 'controller_profile' from source: play vars 11792 1727096171.99923: variable 'controller_profile' from source: play vars 11792 1727096172.00009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096172.00202: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096172.00247: Loading 
TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096172.00301: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096172.00338: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096172.00404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096172.00472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096172.00483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096172.00495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096172.00560: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096172.00845: variable 'network_connections' from source: task vars 11792 1727096172.00861: variable 'controller_profile' from source: play vars 11792 1727096172.00924: variable 'controller_profile' from source: play vars 11792 1727096172.00936: variable 'controller_device' from source: play vars 11792 1727096172.01007: variable 'controller_device' from source: play vars 11792 1727096172.01020: variable 'dhcp_interface1' from source: play vars 11792 1727096172.01090: variable 'dhcp_interface1' from source: play vars 11792 1727096172.01174: variable 'port1_profile' from source: play vars 11792 1727096172.01177: variable 'port1_profile' from source: play vars 11792 1727096172.01179: variable 'dhcp_interface1' from source: play vars 11792 1727096172.01236: variable 'dhcp_interface1' from source: play vars 11792 1727096172.01245: variable 'controller_profile' from source: play vars 11792 1727096172.01309: variable 'controller_profile' from source: play vars 11792 1727096172.01319: variable 'port2_profile' from source: play vars 11792 1727096172.01375: variable 'port2_profile' from source: play vars 11792 1727096172.01392: variable 'dhcp_interface2' from source: play vars 11792 1727096172.01445: variable 'dhcp_interface2' from source: play vars 11792 1727096172.01461: variable 'controller_profile' from source: play vars 11792 1727096172.01534: variable 'controller_profile' from source: play vars 11792 1727096172.01591: Evaluated conditional (__network_wpa_supplicant_required): False 11792 1727096172.01605: when evaluation is False, skipping this task 11792 1727096172.01674: _execute() done 11792 1727096172.01678: dumping result to json 11792 1727096172.01680: done dumping result, returning 11792 1727096172.01682: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-d9c7-3fc0-000000000a3b] 11792 1727096172.01684: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3b 11792 1727096172.01985: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3b 11792 1727096172.01989: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11792 1727096172.02039: no more pending results, returning what we have 11792 1727096172.02043: results queue empty 11792 1727096172.02044: checking for any_errors_fatal 11792 1727096172.02069: done checking for any_errors_fatal 11792 1727096172.02070: checking for max_fail_percentage 11792 1727096172.02072: done checking for max_fail_percentage 11792 1727096172.02073: checking to see if all hosts have failed and the running result is not ok 11792 1727096172.02073: done checking to see if all hosts have failed 11792 1727096172.02074: getting the remaining hosts for this loop 11792 1727096172.02076: done getting the remaining hosts for this loop 11792 1727096172.02080: getting the next task for host managed_node2 11792 1727096172.02087: done getting next task for host managed_node2 11792 1727096172.02091: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096172.02097: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096172.02119: getting variables 11792 1727096172.02121: in VariableManager get_vars() 11792 1727096172.02177: Calling all_inventory to load vars for managed_node2 11792 1727096172.02181: Calling groups_inventory to load vars for managed_node2 11792 1727096172.02183: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096172.02196: Calling all_plugins_play to load vars for managed_node2 11792 1727096172.02199: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096172.02202: Calling groups_plugins_play to load vars for managed_node2 11792 1727096172.03789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096172.05450: done with get_vars() 11792 1727096172.05487: done getting variables 11792 1727096172.05552: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:12 -0400 (0:00:00.117) 0:00:54.335 ****** 11792 1727096172.05593: entering _queue_task() for managed_node2/service 11792 1727096172.06075: worker is 1 (out of 1 available) 11792 1727096172.06089: exiting _queue_task() for managed_node2/service 11792 1727096172.06101: done queuing things up, now waiting for results queue to drain 11792 1727096172.06102: waiting for pending results... 11792 1727096172.06313: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096172.06518: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a3c 11792 1727096172.06522: variable 'ansible_search_path' from source: unknown 11792 1727096172.06525: variable 'ansible_search_path' from source: unknown 11792 1727096172.06626: calling self._execute() 11792 1727096172.06678: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096172.06690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096172.06702: variable 'omit' from source: magic vars 11792 1727096172.07120: variable 'ansible_distribution_major_version' from source: facts 11792 1727096172.07139: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096172.07269: variable 'network_provider' from source: set_fact 11792 1727096172.07285: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096172.07293: when evaluation is False, skipping this task 11792 1727096172.07300: _execute() done 11792 1727096172.07307: dumping result to json 11792 1727096172.07345: done dumping result, returning 11792 1727096172.07349: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-d9c7-3fc0-000000000a3c] 11792 1727096172.07351: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3c skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096172.07539: no more pending results, returning what we have 11792 1727096172.07542: results queue empty 11792 1727096172.07543: checking for 
any_errors_fatal 11792 1727096172.07555: done checking for any_errors_fatal 11792 1727096172.07555: checking for max_fail_percentage 11792 1727096172.07557: done checking for max_fail_percentage 11792 1727096172.07558: checking to see if all hosts have failed and the running result is not ok 11792 1727096172.07559: done checking to see if all hosts have failed 11792 1727096172.07560: getting the remaining hosts for this loop 11792 1727096172.07561: done getting the remaining hosts for this loop 11792 1727096172.07565: getting the next task for host managed_node2 11792 1727096172.07576: done getting next task for host managed_node2 11792 1727096172.07581: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096172.07588: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096172.07617: getting variables 11792 1727096172.07619: in VariableManager get_vars() 11792 1727096172.07888: Calling all_inventory to load vars for managed_node2 11792 1727096172.07892: Calling groups_inventory to load vars for managed_node2 11792 1727096172.07895: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096172.07902: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3c 11792 1727096172.07905: WORKER PROCESS EXITING 11792 1727096172.07915: Calling all_plugins_play to load vars for managed_node2 11792 1727096172.07918: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096172.07921: Calling groups_plugins_play to load vars for managed_node2 11792 1727096172.09607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096172.11200: done with get_vars() 11792 1727096172.11230: done getting variables 11792 1727096172.11301: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:56:12 -0400 (0:00:00.057) 0:00:54.392 ****** 11792 1727096172.11344: entering _queue_task() for managed_node2/copy 11792 1727096172.11735: worker is 1 (out of 1 available) 11792 1727096172.11751: exiting _queue_task() for managed_node2/copy 11792 1727096172.11765: done queuing things up, now waiting for results queue to drain 11792 1727096172.11767: waiting for pending results... 
11792 1727096172.12071: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096172.12261: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a3d 11792 1727096172.12285: variable 'ansible_search_path' from source: unknown 11792 1727096172.12294: variable 'ansible_search_path' from source: unknown 11792 1727096172.12339: calling self._execute() 11792 1727096172.12445: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096172.12520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096172.12523: variable 'omit' from source: magic vars 11792 1727096172.12868: variable 'ansible_distribution_major_version' from source: facts 11792 1727096172.12884: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096172.13010: variable 'network_provider' from source: set_fact 11792 1727096172.13020: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096172.13027: when evaluation is False, skipping this task 11792 1727096172.13035: _execute() done 11792 1727096172.13042: dumping result to json 11792 1727096172.13049: done dumping result, returning 11792 1727096172.13075: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-d9c7-3fc0-000000000a3d] 11792 1727096172.13085: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3d 11792 1727096172.13251: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3d 11792 1727096172.13257: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11792 1727096172.13326: no more pending results, returning what we have 11792 1727096172.13330: results queue empty 11792 1727096172.13331: checking for any_errors_fatal 11792 1727096172.13338: done checking for any_errors_fatal 11792 1727096172.13339: checking for max_fail_percentage 11792 1727096172.13341: done checking for max_fail_percentage 11792 1727096172.13342: checking to see if all hosts have failed and the running result is not ok 11792 1727096172.13343: done checking to see if all hosts have failed 11792 1727096172.13344: getting the remaining hosts for this loop 11792 1727096172.13346: done getting the remaining hosts for this loop 11792 1727096172.13349: getting the next task for host managed_node2 11792 1727096172.13362: done getting next task for host managed_node2 11792 1727096172.13365: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096172.13374: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096172.13397: getting variables 11792 1727096172.13398: in VariableManager get_vars() 11792 1727096172.13444: Calling all_inventory to load vars for managed_node2 11792 1727096172.13447: Calling groups_inventory to load vars for managed_node2 11792 1727096172.13450: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096172.13465: Calling all_plugins_play to load vars for managed_node2 11792 1727096172.13585: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096172.13589: Calling groups_plugins_play to load vars for managed_node2 11792 1727096172.15199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096172.16830: done with get_vars() 11792 1727096172.16873: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:56:12 -0400 (0:00:00.056) 0:00:54.448 ****** 11792 1727096172.16972: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096172.17366: worker is 1 (out of 1 available) 11792 1727096172.17582: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096172.17594: done queuing things up, now waiting for results queue to drain 11792 1727096172.17595: waiting for pending results... 
11792 1727096172.17805: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096172.17974: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a3e 11792 1727096172.17977: variable 'ansible_search_path' from source: unknown 11792 1727096172.17979: variable 'ansible_search_path' from source: unknown 11792 1727096172.18011: calling self._execute() 11792 1727096172.18122: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096172.18134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096172.18151: variable 'omit' from source: magic vars 11792 1727096172.18574: variable 'ansible_distribution_major_version' from source: facts 11792 1727096172.18578: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096172.18587: variable 'omit' from source: magic vars 11792 1727096172.18673: variable 'omit' from source: magic vars 11792 1727096172.18875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096172.21401: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096172.21495: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096172.21535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096172.21591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096172.21602: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096172.21683: variable 'network_provider' from source: set_fact 11792 1727096172.21832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096172.21918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096172.21921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096172.21946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096172.21969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096172.22057: variable 'omit' from source: magic vars 11792 1727096172.22178: variable 'omit' from source: magic vars 11792 1727096172.22292: variable 'network_connections' from source: task vars 11792 1727096172.22308: variable 'controller_profile' from source: play vars 11792 1727096172.22385: variable 'controller_profile' from source: play vars 11792 1727096172.22463: variable 'controller_device' from source: play vars 11792 1727096172.22467: variable 'controller_device' from source: play vars 11792 1727096172.22488: variable 'dhcp_interface1' 
from source: play vars 11792 1727096172.22550: variable 'dhcp_interface1' from source: play vars 11792 1727096172.22573: variable 'port1_profile' from source: play vars 11792 1727096172.22638: variable 'port1_profile' from source: play vars 11792 1727096172.22655: variable 'dhcp_interface1' from source: play vars 11792 1727096172.22725: variable 'dhcp_interface1' from source: play vars 11792 1727096172.22738: variable 'controller_profile' from source: play vars 11792 1727096172.22898: variable 'controller_profile' from source: play vars 11792 1727096172.22903: variable 'port2_profile' from source: play vars 11792 1727096172.22905: variable 'port2_profile' from source: play vars 11792 1727096172.22907: variable 'dhcp_interface2' from source: play vars 11792 1727096172.22958: variable 'dhcp_interface2' from source: play vars 11792 1727096172.22973: variable 'controller_profile' from source: play vars 11792 1727096172.23044: variable 'controller_profile' from source: play vars 11792 1727096172.23260: variable 'omit' from source: magic vars 11792 1727096172.23275: variable '__lsr_ansible_managed' from source: task vars 11792 1727096172.23343: variable '__lsr_ansible_managed' from source: task vars 11792 1727096172.23552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11792 1727096172.23880: Loaded config def from plugin (lookup/template) 11792 1727096172.23883: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11792 1727096172.23887: File lookup term: get_ansible_managed.j2 11792 1727096172.23889: variable 'ansible_search_path' from source: unknown 11792 1727096172.23891: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11792 1727096172.23895: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11792 1727096172.23897: variable 'ansible_search_path' from source: unknown 11792 1727096172.41306: variable 'ansible_managed' from source: unknown 11792 1727096172.41580: variable 'omit' from source: magic vars 11792 1727096172.41584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096172.41587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096172.41589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096172.41592: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096172.41599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096172.41602: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096172.41604: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096172.41605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096172.41710: Set connection var ansible_timeout to 10 11792 1727096172.41715: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096172.41877: Set connection var ansible_shell_executable to /bin/sh 11792 1727096172.41880: Set connection var ansible_pipelining to False 11792 1727096172.41882: Set connection var ansible_shell_type to sh 11792 1727096172.41884: Set connection var ansible_connection to ssh 11792 1727096172.41886: variable 'ansible_shell_executable' from source: unknown 11792 1727096172.41888: variable 'ansible_connection' from source: unknown 11792 1727096172.41890: variable 'ansible_module_compression' from source: unknown 11792 1727096172.41892: variable 'ansible_shell_type' from source: unknown 11792 1727096172.41894: variable 'ansible_shell_executable' from source: unknown 11792 1727096172.41896: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096172.41898: variable 'ansible_pipelining' from source: unknown 11792 1727096172.41902: variable 'ansible_timeout' from source: unknown 11792 1727096172.41904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096172.41907: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096172.41910: variable 'omit' from source: magic vars 11792 1727096172.41912: starting attempt loop 11792 1727096172.41913: running the handler 11792 1727096172.41915: _low_level_execute_command(): starting 11792 1727096172.41917: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096172.42689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096172.42728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096172.42792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096172.42796: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 11792 1727096172.42825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096172.42890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096172.44612: stdout chunk (state=3): >>>/root <<< 11792 1727096172.44762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096172.44766: stdout chunk (state=3): >>><<< 11792 1727096172.44771: stderr chunk (state=3): >>><<< 11792 1727096172.44790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096172.44808: _low_level_execute_command(): starting 11792 1727096172.44820: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556 `" && echo ansible-tmp-1727096172.4479191-14342-61059849199556="` echo /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556 `" ) && sleep 0' 11792 1727096172.45564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096172.45586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096172.45602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096172.45618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096172.45639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096172.45682: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096172.45781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' <<< 11792 1727096172.45784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096172.45831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096172.45869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096172.47872: stdout chunk (state=3): >>>ansible-tmp-1727096172.4479191-14342-61059849199556=/root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556 <<< 11792 1727096172.47980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096172.48009: stderr chunk (state=3): >>><<< 11792 1727096172.48012: stdout chunk (state=3): >>><<< 11792 1727096172.48031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096172.4479191-14342-61059849199556=/root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096172.48074: variable 'ansible_module_compression' from source: unknown 11792 1727096172.48140: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11792 1727096172.48177: variable 'ansible_facts' from source: unknown 11792 1727096172.48572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/AnsiballZ_network_connections.py 11792 1727096172.48576: Sending initial data 11792 1727096172.48579: Sent initial data (167 bytes) 11792 1727096172.49271: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096172.49282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096172.49319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096172.49324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 
1727096172.49407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096172.49506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096172.49709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096172.51266: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096172.51297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096172.51366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp9oqspysw /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/AnsiballZ_network_connections.py <<< 11792 1727096172.51371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/AnsiballZ_network_connections.py" <<< 11792 1727096172.51400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp9oqspysw" to remote "/root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/AnsiballZ_network_connections.py" <<< 11792 1727096172.52448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096172.52452: stderr chunk (state=3): >>><<< 11792 1727096172.52457: stdout chunk (state=3): >>><<< 11792 1727096172.52584: done transferring module to remote 11792 1727096172.52594: _low_level_execute_command(): starting 11792 1727096172.52652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/ /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/AnsiballZ_network_connections.py && sleep 0' 11792 1727096172.53533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096172.53537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096172.53594: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096172.53641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096172.53644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096172.53662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096172.53738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096172.55781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096172.55786: stdout chunk (state=3): >>><<< 11792 1727096172.55788: stderr chunk (state=3): >>><<< 11792 1727096172.55791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096172.55794: _low_level_execute_command(): starting 11792 1727096172.55796: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/AnsiballZ_network_connections.py && sleep 0' 11792 1727096172.56462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096172.56566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096172.56626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096172.56676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096172.99478: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11792 1727096173.01611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096173.01640: stderr chunk (state=3): >>><<< 11792 1727096173.01645: stdout chunk (state=3): >>><<< 11792 1727096173.01664: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
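
For reference, the module arguments echoed back in the stdout above correspond to the role's `network_connections` input. The same payload re-expressed as a play variable, with every value copied from the invocation in this log (this is a readability aid, not an excerpt from the test playbook):

```yaml
network_connections:
  # Bond controller with active-backup mode and ARP monitoring
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      arp_interval: 60
      arp_ip_target: 192.0.2.128
      arp_validate: none
      primary: test1
    ip:
      route_metric4: 65535
  # Two ethernet ports attached to the bond controller
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```
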
11792 1727096173.01710: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'arp_interval': 60, 'arp_ip_target': '192.0.2.128', 'arp_validate': 'none', 'primary': 'test1'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096173.01717: _low_level_execute_command(): starting 11792 1727096173.01722: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096172.4479191-14342-61059849199556/ > /dev/null 2>&1 && sleep 0' 11792 1727096173.02164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096173.02174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096173.02204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.02207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096173.02209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096173.02212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.02274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096173.02281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096173.02283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096173.02317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096173.04312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096173.04336: stderr chunk (state=3): >>><<< 11792 1727096173.04340: stdout chunk (state=3): >>><<< 11792 1727096173.04357: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096173.04360: handler run complete 11792 1727096173.04393: attempt loop complete, returning result 11792 1727096173.04396: _execute() done 11792 1727096173.04399: dumping result to json 11792 1727096173.04403: done dumping result, returning 11792 1727096173.04411: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-d9c7-3fc0-000000000a3e] 11792 1727096173.04414: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3e 11792 1727096173.04533: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3e 11792 1727096173.04536: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active) 11792 1727096173.04681: no more pending results, returning what we have 11792 1727096173.04684: results queue empty 11792 1727096173.04685: checking for any_errors_fatal 11792 1727096173.04692: done checking for 
any_errors_fatal 11792 1727096173.04692: checking for max_fail_percentage 11792 1727096173.04694: done checking for max_fail_percentage 11792 1727096173.04695: checking to see if all hosts have failed and the running result is not ok 11792 1727096173.04695: done checking to see if all hosts have failed 11792 1727096173.04696: getting the remaining hosts for this loop 11792 1727096173.04698: done getting the remaining hosts for this loop 11792 1727096173.04701: getting the next task for host managed_node2 11792 1727096173.04708: done getting next task for host managed_node2 11792 1727096173.04711: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096173.04715: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.04726: getting variables 11792 1727096173.04727: in VariableManager get_vars() 11792 1727096173.04773: Calling all_inventory to load vars for managed_node2 11792 1727096173.04776: Calling groups_inventory to load vars for managed_node2 11792 1727096173.04782: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.04791: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.04794: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.04797: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.05721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.06605: done with get_vars() 11792 1727096173.06625: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:56:13 -0400 (0:00:00.897) 0:00:55.346 ****** 11792 1727096173.06699: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096173.06976: worker is 1 (out of 1 available) 11792 1727096173.06990: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096173.07002: done queuing things up, now waiting for results queue to drain 11792 1727096173.07004: waiting for pending results... 
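For reference, the module arguments logged for the "Configure networking connection profiles" task above map back onto role variables along the following lines. This is a minimal sketch reconstructed from the logged module_args; the play header and host name are placeholders rather than content taken from the original playbook.

- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          # Bond controller in active-backup mode with ARP monitoring,
          # matching the logged 'bond0' connection profile.
          - name: bond0
            state: up
            type: bond
            interface_name: nm-bond
            bond:
              mode: active-backup
              arp_interval: 60
              arp_ip_target: 192.0.2.128
              arp_validate: none
              primary: test1
            ip:
              route_metric4: 65535
          # Two ethernet ports attached to the bond, as in the logged
          # 'bond0.0' and 'bond0.1' profiles.
          - name: bond0.0
            state: up
            type: ethernet
            interface_name: test1
            controller: bond0
          - name: bond0.1
            state: up
            type: ethernet
            interface_name: test2
            controller: bond0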
11792 1727096173.07190: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096173.07296: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a3f 11792 1727096173.07309: variable 'ansible_search_path' from source: unknown 11792 1727096173.07313: variable 'ansible_search_path' from source: unknown 11792 1727096173.07342: calling self._execute() 11792 1727096173.07425: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.07429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.07438: variable 'omit' from source: magic vars 11792 1727096173.07718: variable 'ansible_distribution_major_version' from source: facts 11792 1727096173.07727: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096173.07813: variable 'network_state' from source: role '' defaults 11792 1727096173.07822: Evaluated conditional (network_state != {}): False 11792 1727096173.07825: when evaluation is False, skipping this task 11792 1727096173.07827: _execute() done 11792 1727096173.07831: dumping result to json 11792 1727096173.07834: done dumping result, returning 11792 1727096173.07842: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-d9c7-3fc0-000000000a3f] 11792 1727096173.07844: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3f 11792 1727096173.07934: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a3f 11792 1727096173.07937: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096173.07991: no more pending results, returning what we have 11792 1727096173.07996: results queue empty 11792 1727096173.07997: checking for any_errors_fatal 11792 1727096173.08012: done checking for any_errors_fatal 11792 1727096173.08012: checking for max_fail_percentage 11792 1727096173.08014: done checking for max_fail_percentage 11792 1727096173.08015: checking to see if all hosts have failed and the running result is not ok 11792 1727096173.08016: done checking to see if all hosts have failed 11792 1727096173.08016: getting the remaining hosts for this loop 11792 1727096173.08018: done getting the remaining hosts for this loop 11792 1727096173.08021: getting the next task for host managed_node2 11792 1727096173.08029: done getting next task for host managed_node2 11792 1727096173.08032: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096173.08037: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.08060: getting variables 11792 1727096173.08061: in VariableManager get_vars() 11792 1727096173.08102: Calling all_inventory to load vars for managed_node2 11792 1727096173.08104: Calling groups_inventory to load vars for managed_node2 11792 1727096173.08106: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.08115: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.08118: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.08120: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.08914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.13673: done with get_vars() 11792 1727096173.13693: done getting variables 11792 1727096173.13730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:56:13 -0400 (0:00:00.070) 0:00:55.416 ****** 11792 1727096173.13752: entering _queue_task() for managed_node2/debug 11792 1727096173.14036: worker is 1 (out of 1 available) 11792 1727096173.14050: exiting _queue_task() for managed_node2/debug 11792 1727096173.14063: done queuing things up, now waiting for results queue to drain 11792 1727096173.14065: waiting for pending results... 
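The "Configure networking state" task queued above ends up skipped because the role's network_state variable is still at its default of {}. Purely as a hypothetical illustration (the schema shown is an assumption about the role's nmstate-style interface, not something taken from this run), a non-empty network_state might look roughly like:

network_state:
  interfaces:
    - name: nm-bond   # illustrative only; any managed interface could appear here
      type: bond
      state: up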
11792 1727096173.14414: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096173.14487: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a40 11792 1727096173.14510: variable 'ansible_search_path' from source: unknown 11792 1727096173.14520: variable 'ansible_search_path' from source: unknown 11792 1727096173.14566: calling self._execute() 11792 1727096173.14775: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.14779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.14782: variable 'omit' from source: magic vars 11792 1727096173.15102: variable 'ansible_distribution_major_version' from source: facts 11792 1727096173.15126: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096173.15138: variable 'omit' from source: magic vars 11792 1727096173.15230: variable 'omit' from source: magic vars 11792 1727096173.15296: variable 'omit' from source: magic vars 11792 1727096173.15341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096173.15378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096173.15394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096173.15408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.15416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.15441: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096173.15445: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.15447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.15527: Set connection var ansible_timeout to 10 11792 1727096173.15533: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096173.15540: Set connection var ansible_shell_executable to /bin/sh 11792 1727096173.15546: Set connection var ansible_pipelining to False 11792 1727096173.15549: Set connection var ansible_shell_type to sh 11792 1727096173.15551: Set connection var ansible_connection to ssh 11792 1727096173.15573: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.15578: variable 'ansible_connection' from source: unknown 11792 1727096173.15581: variable 'ansible_module_compression' from source: unknown 11792 1727096173.15584: variable 'ansible_shell_type' from source: unknown 11792 1727096173.15586: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.15589: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.15591: variable 'ansible_pipelining' from source: unknown 11792 1727096173.15594: variable 'ansible_timeout' from source: unknown 11792 1727096173.15596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.15699: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 
1727096173.15716: variable 'omit' from source: magic vars 11792 1727096173.15719: starting attempt loop 11792 1727096173.15722: running the handler 11792 1727096173.15816: variable '__network_connections_result' from source: set_fact 11792 1727096173.15870: handler run complete 11792 1727096173.15883: attempt loop complete, returning result 11792 1727096173.15886: _execute() done 11792 1727096173.15888: dumping result to json 11792 1727096173.15891: done dumping result, returning 11792 1727096173.15900: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-d9c7-3fc0-000000000a40] 11792 1727096173.15902: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a40 11792 1727096173.15991: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a40 11792 1727096173.15994: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active)" ] } 11792 1727096173.16063: no more pending results, returning what we have 11792 1727096173.16069: results queue empty 11792 1727096173.16070: checking for any_errors_fatal 11792 1727096173.16077: done checking for any_errors_fatal 11792 1727096173.16077: checking for max_fail_percentage 11792 1727096173.16079: done checking for max_fail_percentage 11792 1727096173.16080: checking to see if all hosts have failed and the running result is not ok 11792 1727096173.16080: done checking to see if all hosts have failed 11792 1727096173.16081: getting the remaining hosts for this loop 11792 1727096173.16082: done getting the remaining hosts for this loop 11792 1727096173.16086: getting the next task for host managed_node2 11792 1727096173.16093: done getting next task for host managed_node2 11792 1727096173.16097: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096173.16104: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.16115: getting variables 11792 1727096173.16116: in VariableManager get_vars() 11792 1727096173.16157: Calling all_inventory to load vars for managed_node2 11792 1727096173.16160: Calling groups_inventory to load vars for managed_node2 11792 1727096173.16162: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.16179: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.16188: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.16192: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.16982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.18566: done with get_vars() 11792 1727096173.18597: done getting variables 11792 1727096173.18665: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:13 -0400 (0:00:00.049) 0:00:55.466 ****** 11792 1727096173.18712: entering _queue_task() for managed_node2/debug 11792 1727096173.19095: worker is 1 (out of 1 available) 11792 1727096173.19110: exiting _queue_task() for managed_node2/debug 11792 1727096173.19124: done queuing things up, now waiting for results queue to drain 11792 1727096173.19126: waiting for pending results... 
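The "Show stderr messages for the network_connections" output above is what a plain debug of the registered result's stderr_lines produces. A task likely equivalent to the one at tasks/main.yml:177 would be the following sketch, inferred from the logged output rather than copied from the role source:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines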
11792 1727096173.19588: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096173.19601: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a41 11792 1727096173.19622: variable 'ansible_search_path' from source: unknown 11792 1727096173.19629: variable 'ansible_search_path' from source: unknown 11792 1727096173.19674: calling self._execute() 11792 1727096173.19782: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.19799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.19813: variable 'omit' from source: magic vars 11792 1727096173.20212: variable 'ansible_distribution_major_version' from source: facts 11792 1727096173.20335: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096173.20339: variable 'omit' from source: magic vars 11792 1727096173.20341: variable 'omit' from source: magic vars 11792 1727096173.20372: variable 'omit' from source: magic vars 11792 1727096173.20415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096173.20464: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096173.20492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096173.20513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.20527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.20565: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096173.20576: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.20583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.20691: Set connection var ansible_timeout to 10 11792 1727096173.20704: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096173.20717: Set connection var ansible_shell_executable to /bin/sh 11792 1727096173.20725: Set connection var ansible_pipelining to False 11792 1727096173.20733: Set connection var ansible_shell_type to sh 11792 1727096173.20740: Set connection var ansible_connection to ssh 11792 1727096173.20773: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.20781: variable 'ansible_connection' from source: unknown 11792 1727096173.20788: variable 'ansible_module_compression' from source: unknown 11792 1727096173.20794: variable 'ansible_shell_type' from source: unknown 11792 1727096173.20800: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.20874: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.20877: variable 'ansible_pipelining' from source: unknown 11792 1727096173.20879: variable 'ansible_timeout' from source: unknown 11792 1727096173.20881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.20970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 
1727096173.20991: variable 'omit' from source: magic vars 11792 1727096173.21003: starting attempt loop 11792 1727096173.21010: running the handler 11792 1727096173.21065: variable '__network_connections_result' from source: set_fact 11792 1727096173.21159: variable '__network_connections_result' from source: set_fact 11792 1727096173.21342: handler run complete 11792 1727096173.21379: attempt loop complete, returning result 11792 1727096173.21386: _execute() done 11792 1727096173.21392: dumping result to json 11792 1727096173.21415: done dumping result, returning 11792 1727096173.21418: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-d9c7-3fc0-000000000a41] 11792 1727096173.21420: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a41 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active)" ] } } 11792 1727096173.21674: no more pending results, returning what we have 11792 1727096173.21678: results queue empty 11792 1727096173.21686: checking for any_errors_fatal 11792 1727096173.21692: done checking for any_errors_fatal 11792 1727096173.21693: checking for max_fail_percentage 11792 1727096173.21695: done checking for max_fail_percentage 11792 1727096173.21696: checking to see if all hosts 
have failed and the running result is not ok 11792 1727096173.21697: done checking to see if all hosts have failed 11792 1727096173.21697: getting the remaining hosts for this loop 11792 1727096173.21699: done getting the remaining hosts for this loop 11792 1727096173.21703: getting the next task for host managed_node2 11792 1727096173.21712: done getting next task for host managed_node2 11792 1727096173.21716: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096173.21720: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.21734: getting variables 11792 1727096173.21736: in VariableManager get_vars() 11792 1727096173.22128: Calling all_inventory to load vars for managed_node2 11792 1727096173.22131: Calling groups_inventory to load vars for managed_node2 11792 1727096173.22133: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.22144: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.22147: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.22150: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.22781: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a41 11792 1727096173.22785: WORKER PROCESS EXITING 11792 1727096173.24027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.25655: done with get_vars() 11792 1727096173.25683: done getting variables 11792 1727096173.25747: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:13 -0400 (0:00:00.070) 0:00:55.537 ****** 11792 1727096173.25785: entering _queue_task() for managed_node2/debug 11792 1727096173.26265: worker is 1 (out of 1 available) 11792 1727096173.26280: exiting _queue_task() for managed_node2/debug 11792 1727096173.26290: done queuing things up, now waiting for results queue to drain 11792 1727096173.26292: waiting for 
pending results... 11792 1727096173.26484: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096173.26682: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a42 11792 1727096173.26708: variable 'ansible_search_path' from source: unknown 11792 1727096173.26717: variable 'ansible_search_path' from source: unknown 11792 1727096173.26763: calling self._execute() 11792 1727096173.26930: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.26934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.26938: variable 'omit' from source: magic vars 11792 1727096173.27304: variable 'ansible_distribution_major_version' from source: facts 11792 1727096173.27319: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096173.27431: variable 'network_state' from source: role '' defaults 11792 1727096173.27445: Evaluated conditional (network_state != {}): False 11792 1727096173.27451: when evaluation is False, skipping this task 11792 1727096173.27458: _execute() done 11792 1727096173.27464: dumping result to json 11792 1727096173.27476: done dumping result, returning 11792 1727096173.27491: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-d9c7-3fc0-000000000a42] 11792 1727096173.27500: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a42 11792 1727096173.27729: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a42 11792 1727096173.27733: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11792 1727096173.27789: no more pending results, returning what we have 11792 1727096173.27794: results queue empty 11792 1727096173.27799: checking for any_errors_fatal 11792 1727096173.27809: done checking for any_errors_fatal 11792 1727096173.27810: checking for max_fail_percentage 11792 1727096173.27812: done checking for max_fail_percentage 11792 1727096173.27813: checking to see if all hosts have failed and the running result is not ok 11792 1727096173.27813: done checking to see if all hosts have failed 11792 1727096173.27814: getting the remaining hosts for this loop 11792 1727096173.27816: done getting the remaining hosts for this loop 11792 1727096173.27820: getting the next task for host managed_node2 11792 1727096173.27829: done getting next task for host managed_node2 11792 1727096173.27834: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096173.27841: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.27865: getting variables 11792 1727096173.27870: in VariableManager get_vars() 11792 1727096173.28097: Calling all_inventory to load vars for managed_node2 11792 1727096173.28101: Calling groups_inventory to load vars for managed_node2 11792 1727096173.28104: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.28114: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.28117: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.28120: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.29788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.32675: done with get_vars() 11792 1727096173.32706: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:13 -0400 (0:00:00.071) 0:00:55.608 ****** 11792 1727096173.32930: entering _queue_task() for managed_node2/ping 11792 1727096173.33685: worker is 1 (out of 1 available) 11792 1727096173.33700: exiting _queue_task() for managed_node2/ping 11792 1727096173.33714: done queuing things up, now waiting for results queue to drain 11792 1727096173.33831: waiting for pending results... 11792 1727096173.34587: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096173.34648: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a43 11792 1727096173.34675: variable 'ansible_search_path' from source: unknown 11792 1727096173.34682: variable 'ansible_search_path' from source: unknown 11792 1727096173.34725: calling self._execute() 11792 1727096173.35061: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.35173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.35177: variable 'omit' from source: magic vars 11792 1727096173.35711: variable 'ansible_distribution_major_version' from source: facts 11792 1727096173.35787: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096173.35801: variable 'omit' from source: magic vars 11792 1727096173.35943: variable 'omit' from source: magic vars 11792 1727096173.36110: variable 'omit' from source: magic vars 11792 1727096173.36159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096173.36204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096173.36300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096173.36322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.36338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.36392: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096173.36400: variable 'ansible_host' from source: host 
vars for 'managed_node2' 11792 1727096173.36407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.36538: Set connection var ansible_timeout to 10 11792 1727096173.36581: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096173.36594: Set connection var ansible_shell_executable to /bin/sh 11792 1727096173.36603: Set connection var ansible_pipelining to False 11792 1727096173.36609: Set connection var ansible_shell_type to sh 11792 1727096173.36617: Set connection var ansible_connection to ssh 11792 1727096173.36655: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.36669: variable 'ansible_connection' from source: unknown 11792 1727096173.36681: variable 'ansible_module_compression' from source: unknown 11792 1727096173.36688: variable 'ansible_shell_type' from source: unknown 11792 1727096173.36694: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.36700: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.36773: variable 'ansible_pipelining' from source: unknown 11792 1727096173.36776: variable 'ansible_timeout' from source: unknown 11792 1727096173.36778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.36933: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096173.36950: variable 'omit' from source: magic vars 11792 1727096173.36959: starting attempt loop 11792 1727096173.36965: running the handler 11792 1727096173.36989: _low_level_execute_command(): starting 11792 1727096173.37002: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096173.37778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.37872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096173.37938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096173.39923: stdout chunk (state=3): >>>/root <<< 11792 1727096173.39927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096173.39930: stdout chunk (state=3): >>><<< 11792 1727096173.39932: stderr chunk (state=3): >>><<< 11792 1727096173.39936: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096173.39938: _low_level_execute_command(): starting 11792 1727096173.39942: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528 `" && echo ansible-tmp-1727096173.3987494-14390-198735988786528="` echo /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528 `" ) && sleep 0' 11792 1727096173.41026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096173.41044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096173.41077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096173.41101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096173.41115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096173.41126: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096173.41206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.41248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096173.41272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096173.41292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096173.41437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096173.43582: stdout chunk (state=3): >>>ansible-tmp-1727096173.3987494-14390-198735988786528=/root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528 <<< 11792 
1727096173.43973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096173.43977: stdout chunk (state=3): >>><<< 11792 1727096173.43979: stderr chunk (state=3): >>><<< 11792 1727096173.43982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096173.3987494-14390-198735988786528=/root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096173.43984: variable 'ansible_module_compression' from source: unknown 11792 1727096173.43986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11792 1727096173.44107: variable 'ansible_facts' from source: unknown 11792 1727096173.44302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/AnsiballZ_ping.py 11792 1727096173.44561: Sending initial data 11792 1727096173.44565: Sent initial data (153 bytes) 11792 1727096173.46106: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096173.46255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096173.46597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096173.48310: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096173.48315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096173.48452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpukbx0ot4 /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/AnsiballZ_ping.py <<< 11792 1727096173.48456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/AnsiballZ_ping.py" <<< 11792 1727096173.48465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpukbx0ot4" to remote "/root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/AnsiballZ_ping.py" <<< 11792 1727096173.49523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096173.49694: stderr chunk (state=3): >>><<< 11792 1727096173.49705: stdout chunk (state=3): >>><<< 11792 1727096173.49708: done transferring module to remote 11792 1727096173.49710: _low_level_execute_command(): starting 11792 1727096173.49712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/ /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/AnsiballZ_ping.py && sleep 0' 11792 1727096173.50838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096173.50841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096173.50844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.50846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096173.50849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.51199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096173.53088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096173.53197: stderr chunk (state=3): >>><<< 11792 1727096173.53201: stdout chunk (state=3): >>><<< 11792 1727096173.53335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096173.53343: _low_level_execute_command(): starting 11792 1727096173.53346: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/AnsiballZ_ping.py && sleep 0' 11792 1727096173.54607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096173.54610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096173.54613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096173.54615: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096173.54617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.54680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096173.54683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096173.55287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096173.55427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096173.71723: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": 
{"data": "pong"}}} <<< 11792 1727096173.73225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096173.73231: stdout chunk (state=3): >>><<< 11792 1727096173.73234: stderr chunk (state=3): >>><<< 11792 1727096173.73256: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096173.73351: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096173.73355: _low_level_execute_command(): starting 11792 1727096173.73357: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096173.3987494-14390-198735988786528/ > /dev/null 2>&1 && sleep 0' 11792 1727096173.74505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096173.74510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.74534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096173.74540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096173.74563: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096173.74566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096173.74621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096173.74624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096173.74765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096173.74818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096173.76742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096173.76833: stderr chunk (state=3): >>><<< 11792 1727096173.76837: stdout chunk (state=3): >>><<< 11792 1727096173.76859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096173.76862: handler run complete 11792 1727096173.76881: attempt loop complete, returning result 11792 1727096173.76884: _execute() done 11792 1727096173.76887: dumping result to json 11792 1727096173.76889: done dumping result, returning 11792 1727096173.76901: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-d9c7-3fc0-000000000a43] 11792 1727096173.76909: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a43 11792 1727096173.77269: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a43 11792 1727096173.77273: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11792 1727096173.77342: no more pending results, returning what we have 11792 1727096173.77346: results queue empty 11792 1727096173.77347: checking for any_errors_fatal 11792 1727096173.77353: done checking for any_errors_fatal 11792 1727096173.77354: checking for max_fail_percentage 11792 1727096173.77356: done checking for max_fail_percentage 11792 1727096173.77357: checking to see if all hosts have failed and the running result is not ok 11792 1727096173.77358: done checking to see if all hosts have failed 11792 1727096173.77358: getting the remaining hosts for this loop 11792 1727096173.77365: done 
getting the remaining hosts for this loop 11792 1727096173.77371: getting the next task for host managed_node2 11792 1727096173.77383: done getting next task for host managed_node2 11792 1727096173.77385: ^ task is: TASK: meta (role_complete) 11792 1727096173.77390: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.77403: getting variables 11792 1727096173.77405: in VariableManager get_vars() 11792 1727096173.77456: Calling all_inventory to load vars for managed_node2 11792 1727096173.77460: Calling groups_inventory to load vars for managed_node2 11792 1727096173.77462: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.77780: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.77788: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.77793: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.81314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.84834: done with get_vars() 11792 1727096173.84929: done getting variables 11792 1727096173.85027: done queuing things up, now waiting for results queue to drain 11792 1727096173.85031: results queue empty 11792 1727096173.85031: checking for any_errors_fatal 11792 1727096173.85034: done checking for any_errors_fatal 11792 1727096173.85035: checking for max_fail_percentage 11792 1727096173.85036: done checking for max_fail_percentage 11792 1727096173.85037: checking to see if all hosts have failed and the running result is not ok 11792 1727096173.85038: done checking to see if all hosts have failed 11792 1727096173.85038: getting the remaining hosts for this loop 11792 1727096173.85039: done getting the remaining hosts for this loop 11792 1727096173.85042: getting the next task for host managed_node2 11792 1727096173.85047: done getting next task for host managed_node2 11792 1727096173.85049: ^ task is: TASK: Show result 11792 1727096173.85052: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.85057: getting variables 11792 1727096173.85058: in VariableManager get_vars() 11792 1727096173.85076: Calling all_inventory to load vars for managed_node2 11792 1727096173.85078: Calling groups_inventory to load vars for managed_node2 11792 1727096173.85093: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.85099: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.85101: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.85104: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.86332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.89048: done with get_vars() 11792 1727096173.89088: done getting variables 11792 1727096173.89140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33 Monday 23 September 2024 08:56:13 -0400 (0:00:00.562) 0:00:56.170 ****** 11792 1727096173.89177: entering _queue_task() for managed_node2/debug 11792 1727096173.89534: worker is 1 (out of 1 available) 11792 1727096173.89546: exiting _queue_task() for managed_node2/debug 11792 1727096173.89561: done queuing things up, now waiting for results queue to drain 11792 1727096173.89563: waiting for pending results... 
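The "Show result" task queued above is a debug of the __network_connections_result fact set by the network role; its definition is not reproduced in this log. A minimal sketch of what the task plausibly looks like, assuming a plain debug of that variable (only the task name, the task path, and the variable name are confirmed by the log):

    # Plausible reconstruction of the task at
    # tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33
    # (not quoted from the file itself)
    - name: Show result
      debug:
        var: __network_connections_result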
11792 1727096173.89865: running TaskExecutor() for managed_node2/TASK: Show result 11792 1727096173.90008: in run() - task 0afff68d-5257-d9c7-3fc0-000000000a73 11792 1727096173.90033: variable 'ansible_search_path' from source: unknown 11792 1727096173.90042: variable 'ansible_search_path' from source: unknown 11792 1727096173.90097: calling self._execute() 11792 1727096173.90218: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.90231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.90246: variable 'omit' from source: magic vars 11792 1727096173.90671: variable 'ansible_distribution_major_version' from source: facts 11792 1727096173.90690: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096173.90702: variable 'omit' from source: magic vars 11792 1727096173.90761: variable 'omit' from source: magic vars 11792 1727096173.90775: variable 'omit' from source: magic vars 11792 1727096173.90822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096173.90872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096173.90900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096173.90923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.90984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096173.90990: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096173.91122: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.91125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.91389: Set connection var ansible_timeout to 10 11792 1727096173.91403: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096173.91469: Set connection var ansible_shell_executable to /bin/sh 11792 1727096173.91481: Set connection var ansible_pipelining to False 11792 1727096173.91493: Set connection var ansible_shell_type to sh 11792 1727096173.91500: Set connection var ansible_connection to ssh 11792 1727096173.91529: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.91579: variable 'ansible_connection' from source: unknown 11792 1727096173.91588: variable 'ansible_module_compression' from source: unknown 11792 1727096173.91600: variable 'ansible_shell_type' from source: unknown 11792 1727096173.91709: variable 'ansible_shell_executable' from source: unknown 11792 1727096173.91712: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.91715: variable 'ansible_pipelining' from source: unknown 11792 1727096173.91717: variable 'ansible_timeout' from source: unknown 11792 1727096173.91719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.91964: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096173.92171: variable 'omit' from source: magic vars 11792 1727096173.92184: 
starting attempt loop 11792 1727096173.92191: running the handler 11792 1727096173.92248: variable '__network_connections_result' from source: set_fact 11792 1727096173.92374: variable '__network_connections_result' from source: set_fact 11792 1727096173.92551: handler run complete 11792 1727096173.92600: attempt loop complete, returning result 11792 1727096173.92607: _execute() done 11792 1727096173.92613: dumping result to json 11792 1727096173.92672: done dumping result, returning 11792 1727096173.92676: done running TaskExecutor() for managed_node2/TASK: Show result [0afff68d-5257-d9c7-3fc0-000000000a73] 11792 1727096173.92678: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a73 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 904f86c6-f9de-41d4-8310-1428b04dc202 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 48cd831f-8976-4885-b9b3-6c2ccae54189 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 60ed1929-4403-4ab6-a959-1c7d3a8cd186 (not-active)" ] } } 11792 1727096173.92862: no more pending results, returning what we have 11792 1727096173.92866: results queue empty 11792 1727096173.92866: checking for any_errors_fatal 11792 1727096173.92871: done checking for any_errors_fatal 11792 1727096173.92878: checking for max_fail_percentage 11792 1727096173.92880: done checking for max_fail_percentage 11792 1727096173.92882: checking to see if all hosts have failed and the running result is not ok 11792 1727096173.92882: done checking to see if all hosts have failed 11792 1727096173.92883: getting the 
remaining hosts for this loop 11792 1727096173.92885: done getting the remaining hosts for this loop 11792 1727096173.92888: getting the next task for host managed_node2 11792 1727096173.92899: done getting next task for host managed_node2 11792 1727096173.92902: ^ task is: TASK: Asserts 11792 1727096173.92905: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096173.92911: getting variables 11792 1727096173.92913: in VariableManager get_vars() 11792 1727096173.92962: Calling all_inventory to load vars for managed_node2 11792 1727096173.92965: Calling groups_inventory to load vars for managed_node2 11792 1727096173.93174: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096173.93182: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000a73 11792 1727096173.93185: WORKER PROCESS EXITING 11792 1727096173.93197: Calling all_plugins_play to load vars for managed_node2 11792 1727096173.93200: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096173.93203: Calling groups_plugins_play to load vars for managed_node2 11792 1727096173.94821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096173.97469: done with get_vars() 11792 1727096173.97502: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Monday 23 September 2024 08:56:13 -0400 (0:00:00.084) 0:00:56.255 ****** 11792 1727096173.97602: entering _queue_task() for managed_node2/include_tasks 11792 1727096173.98099: worker is 1 (out of 1 available) 11792 1727096173.98111: exiting _queue_task() for managed_node2/include_tasks 11792 1727096173.98123: done queuing things up, now waiting for results queue to drain 11792 1727096173.98124: waiting for pending results... 
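The "Asserts" task at run_test.yml:36 is an include_tasks driven by the lsr_assert variable; the log below shows it resolving a single item, tasks/assert_bond_options.yml. A minimal sketch under the assumption that the include loops directly over lsr_assert (the variable, the loop item, and the included file are confirmed by the log; the exact wording of the task is not):

    # Plausible reconstruction of the task at
    # tests/network/playbooks/tasks/run_test.yml:36
    # (not quoted from the file itself)
    - name: Asserts
      include_tasks: "{{ item }}"
      loop: "{{ lsr_assert }}"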
11792 1727096173.98355: running TaskExecutor() for managed_node2/TASK: Asserts 11792 1727096173.98545: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008ef 11792 1727096173.98591: variable 'ansible_search_path' from source: unknown 11792 1727096173.98595: variable 'ansible_search_path' from source: unknown 11792 1727096173.98632: variable 'lsr_assert' from source: include params 11792 1727096173.98873: variable 'lsr_assert' from source: include params 11792 1727096173.99026: variable 'omit' from source: magic vars 11792 1727096173.99110: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096173.99125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096173.99141: variable 'omit' from source: magic vars 11792 1727096173.99387: variable 'ansible_distribution_major_version' from source: facts 11792 1727096173.99403: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096173.99414: variable 'item' from source: unknown 11792 1727096173.99496: variable 'item' from source: unknown 11792 1727096173.99535: variable 'item' from source: unknown 11792 1727096173.99609: variable 'item' from source: unknown 11792 1727096173.99960: dumping result to json 11792 1727096173.99963: done dumping result, returning 11792 1727096173.99966: done running TaskExecutor() for managed_node2/TASK: Asserts [0afff68d-5257-d9c7-3fc0-0000000008ef] 11792 1727096173.99970: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ef 11792 1727096174.00014: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008ef 11792 1727096174.00019: WORKER PROCESS EXITING 11792 1727096174.00044: no more pending results, returning what we have 11792 1727096174.00049: in VariableManager get_vars() 11792 1727096174.00110: Calling all_inventory to load vars for managed_node2 11792 1727096174.00114: Calling groups_inventory to load vars for managed_node2 11792 1727096174.00116: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096174.00132: Calling all_plugins_play to load vars for managed_node2 11792 1727096174.00135: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096174.00139: Calling groups_plugins_play to load vars for managed_node2 11792 1727096174.02099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096174.03841: done with get_vars() 11792 1727096174.03866: variable 'ansible_search_path' from source: unknown 11792 1727096174.03871: variable 'ansible_search_path' from source: unknown 11792 1727096174.03916: we have included files to process 11792 1727096174.03917: generating all_blocks data 11792 1727096174.03919: done generating all_blocks data 11792 1727096174.03925: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11792 1727096174.03926: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11792 1727096174.03929: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11792 1727096174.04201: in VariableManager get_vars() 11792 1727096174.04226: done with get_vars() 11792 1727096174.04274: in VariableManager get_vars() 11792 1727096174.04299: done with get_vars() 11792 1727096174.04313: done processing included file 11792 1727096174.04315: 
iterating over new_blocks loaded from include file 11792 1727096174.04317: in VariableManager get_vars() 11792 1727096174.04334: done with get_vars() 11792 1727096174.04336: filtering new block on tags 11792 1727096174.04388: done filtering new block on tags 11792 1727096174.04391: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node2 => (item=tasks/assert_bond_options.yml) 11792 1727096174.04396: extending task lists for all hosts with included blocks 11792 1727096174.07508: done extending task lists 11792 1727096174.07510: done processing included files 11792 1727096174.07511: results queue empty 11792 1727096174.07512: checking for any_errors_fatal 11792 1727096174.07518: done checking for any_errors_fatal 11792 1727096174.07519: checking for max_fail_percentage 11792 1727096174.07520: done checking for max_fail_percentage 11792 1727096174.07521: checking to see if all hosts have failed and the running result is not ok 11792 1727096174.07522: done checking to see if all hosts have failed 11792 1727096174.07522: getting the remaining hosts for this loop 11792 1727096174.07524: done getting the remaining hosts for this loop 11792 1727096174.07526: getting the next task for host managed_node2 11792 1727096174.07531: done getting next task for host managed_node2 11792 1727096174.07533: ^ task is: TASK: ** TEST check bond settings 11792 1727096174.07536: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096174.07539: getting variables 11792 1727096174.07540: in VariableManager get_vars() 11792 1727096174.07561: Calling all_inventory to load vars for managed_node2 11792 1727096174.07564: Calling groups_inventory to load vars for managed_node2 11792 1727096174.07566: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096174.07575: Calling all_plugins_play to load vars for managed_node2 11792 1727096174.07577: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096174.07580: Calling groups_plugins_play to load vars for managed_node2 11792 1727096174.08787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096174.10382: done with get_vars() 11792 1727096174.10414: done getting variables 11792 1727096174.10470: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Monday 23 September 2024 08:56:14 -0400 (0:00:00.128) 0:00:56.384 ****** 11792 1727096174.10504: entering _queue_task() for managed_node2/command 11792 1727096174.10896: worker is 1 (out of 1 available) 11792 1727096174.10910: exiting _queue_task() for managed_node2/command 11792 1727096174.10924: done queuing things up, now waiting for results queue to drain 11792 1727096174.10926: waiting for pending results... 
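The "** TEST check bond settings" task that runs next reads bond parameters from sysfs on the managed node and loops over bond_options_to_assert with loop variable bond_opt; the executed command, the variables, and the condition "bond_opt.value in result.stdout" all appear in the log below. A minimal sketch of such a task, assuming the sysfs path is templated from controller_device and bond_opt.key (changed_when: false is inferred from the "changed": false per-item result; any retry count would be an assumption and is omitted):

    # Plausible reconstruction of the task at
    # tests/network/playbooks/tasks/assert_bond_options.yml:3
    # (not quoted from the file itself)
    - name: "** TEST check bond settings"
      command: "cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}"
      register: result
      until: bond_opt.value in result.stdout
      changed_when: false
      loop: "{{ bond_options_to_assert }}"
      loop_control:
        loop_var: bond_opt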
11792 1727096174.11387: running TaskExecutor() for managed_node2/TASK: ** TEST check bond settings 11792 1727096174.11391: in run() - task 0afff68d-5257-d9c7-3fc0-000000000c2a 11792 1727096174.11394: variable 'ansible_search_path' from source: unknown 11792 1727096174.11397: variable 'ansible_search_path' from source: unknown 11792 1727096174.11418: variable 'bond_options_to_assert' from source: set_fact 11792 1727096174.11640: variable 'bond_options_to_assert' from source: set_fact 11792 1727096174.11770: variable 'omit' from source: magic vars 11792 1727096174.11919: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096174.11936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096174.12173: variable 'omit' from source: magic vars 11792 1727096174.12218: variable 'ansible_distribution_major_version' from source: facts 11792 1727096174.12235: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096174.12247: variable 'omit' from source: magic vars 11792 1727096174.12310: variable 'omit' from source: magic vars 11792 1727096174.12522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096174.15062: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096174.15150: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096174.15197: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096174.15236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096174.15272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096174.15381: variable 'controller_device' from source: play vars 11792 1727096174.15391: variable 'bond_opt' from source: unknown 11792 1727096174.15420: variable 'omit' from source: magic vars 11792 1727096174.15469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096174.15503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096174.15529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096174.15557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096174.15574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096174.15606: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096174.15615: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096174.15622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096174.15732: Set connection var ansible_timeout to 10 11792 1727096174.15746: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096174.15766: Set connection var ansible_shell_executable to /bin/sh 11792 1727096174.15778: Set connection var ansible_pipelining to False 11792 1727096174.15784: Set connection var ansible_shell_type to sh 11792 1727096174.15874: Set connection var ansible_connection to ssh 11792 
1727096174.15877: variable 'ansible_shell_executable' from source: unknown 11792 1727096174.15879: variable 'ansible_connection' from source: unknown 11792 1727096174.15881: variable 'ansible_module_compression' from source: unknown 11792 1727096174.15883: variable 'ansible_shell_type' from source: unknown 11792 1727096174.15885: variable 'ansible_shell_executable' from source: unknown 11792 1727096174.15887: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096174.15889: variable 'ansible_pipelining' from source: unknown 11792 1727096174.15891: variable 'ansible_timeout' from source: unknown 11792 1727096174.15893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096174.15973: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096174.15992: variable 'omit' from source: magic vars 11792 1727096174.16001: starting attempt loop 11792 1727096174.16008: running the handler 11792 1727096174.16027: _low_level_execute_command(): starting 11792 1727096174.16036: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096174.16762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.16858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.16884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.18819: stdout chunk (state=3): >>>/root <<< 11792 1727096174.19061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.19065: stdout chunk (state=3): >>><<< 11792 1727096174.19069: stderr chunk (state=3): >>><<< 11792 1727096174.19072: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096174.19075: _low_level_execute_command(): starting 11792 1727096174.19077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206 `" && echo ansible-tmp-1727096174.190009-14428-221662462587206="` echo /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206 `" ) && sleep 0' 11792 1727096174.19698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096174.19707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096174.19717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.19732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096174.19749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096174.19752: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096174.19764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.19782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096174.19790: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096174.19797: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096174.19806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096174.19815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.19827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096174.19835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096174.19843: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096174.19856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.19928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096174.19940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.19966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.20039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.22093: stdout chunk (state=3): >>>ansible-tmp-1727096174.190009-14428-221662462587206=/root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206 <<< 11792 1727096174.22269: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11792 1727096174.22273: stdout chunk (state=3): >>><<< 11792 1727096174.22276: stderr chunk (state=3): >>><<< 11792 1727096174.22374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096174.190009-14428-221662462587206=/root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096174.22378: variable 'ansible_module_compression' from source: unknown 11792 1727096174.22406: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096174.22448: variable 'ansible_facts' from source: unknown 11792 1727096174.22555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/AnsiballZ_command.py 11792 1727096174.22797: Sending initial data 11792 1727096174.22800: Sent initial data (155 bytes) 11792 1727096174.23481: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096174.23571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096174.23586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.23628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096174.23657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.23683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.23813: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11792 1727096174.25483: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096174.25507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096174.25545: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpx0ocof2l /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/AnsiballZ_command.py <<< 11792 1727096174.25549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/AnsiballZ_command.py" <<< 11792 1727096174.25576: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpx0ocof2l" to remote "/root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/AnsiballZ_command.py" <<< 11792 1727096174.27184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.27274: stderr chunk (state=3): >>><<< 11792 1727096174.27283: stdout chunk (state=3): >>><<< 11792 1727096174.27311: done transferring module to remote 11792 1727096174.27377: _low_level_execute_command(): starting 11792 1727096174.27392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/ /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/AnsiballZ_command.py && sleep 0' 11792 1727096174.28621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.28638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.28791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096174.28803: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.28884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.30869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.30977: stderr chunk (state=3): >>><<< 11792 1727096174.30980: stdout chunk (state=3): >>><<< 11792 1727096174.30983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096174.30986: _low_level_execute_command(): starting 11792 1727096174.30989: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/AnsiballZ_command.py && sleep 0' 11792 1727096174.32284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.32288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.32388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096174.32442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.32458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096174.32497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.32687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.49217: stdout chunk (state=3): >>> {"changed": true, "stdout": 
"active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-23 08:56:14.487622", "end": "2024-09-23 08:56:14.490805", "delta": "0:00:00.003183", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096174.51081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.51085: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. <<< 11792 1727096174.51087: stderr chunk (state=3): >>><<< 11792 1727096174.51089: stdout chunk (state=3): >>><<< 11792 1727096174.51092: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-23 08:56:14.487622", "end": "2024-09-23 08:56:14.490805", "delta": "0:00:00.003183", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096174.51094: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096174.51101: _low_level_execute_command(): starting 11792 1727096174.51103: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096174.190009-14428-221662462587206/ > /dev/null 2>&1 && sleep 0' 11792 1727096174.51602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096174.51609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096174.51620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.51635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096174.51647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096174.51656: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096174.51663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.51680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096174.51687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096174.51770: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096174.51785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.51797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.51858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.53875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.53879: stderr chunk (state=3): >>><<< 11792 1727096174.53881: stdout chunk (state=3): >>><<< 11792 1727096174.53884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096174.53886: handler run complete 11792 1727096174.53911: Evaluated conditional (False): False 11792 1727096174.54077: variable 'bond_opt' from source: unknown 11792 1727096174.54083: variable 'result' from source: set_fact 11792 1727096174.54099: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096174.54110: attempt loop complete, returning result 11792 1727096174.54128: variable 'bond_opt' from source: unknown 11792 1727096174.54202: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'mode', 'value': 'active-backup'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "active-backup" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003183", "end": "2024-09-23 08:56:14.490805", "rc": 0, "start": "2024-09-23 08:56:14.487622" } STDOUT: active-backup 1 11792 1727096174.54407: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096174.54412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096174.54415: variable 'omit' from source: magic vars 11792 1727096174.54673: variable 'ansible_distribution_major_version' from source: facts 11792 1727096174.54676: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096174.54679: variable 'omit' from source: magic vars 11792 1727096174.54681: variable 'omit' from source: magic vars 11792 1727096174.54873: variable 'controller_device' from source: play vars 11792 1727096174.54876: variable 'bond_opt' from source: unknown 11792 1727096174.54879: variable 'omit' from source: magic vars 11792 1727096174.54881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096174.54884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096174.54886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096174.54888: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096174.54890: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096174.54892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096174.54916: Set connection var ansible_timeout to 10 11792 1727096174.54923: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096174.54932: Set connection var ansible_shell_executable to /bin/sh 11792 1727096174.54945: Set 
connection var ansible_pipelining to False 11792 1727096174.54948: Set connection var ansible_shell_type to sh 11792 1727096174.54950: Set connection var ansible_connection to ssh 11792 1727096174.54977: variable 'ansible_shell_executable' from source: unknown 11792 1727096174.54980: variable 'ansible_connection' from source: unknown 11792 1727096174.54983: variable 'ansible_module_compression' from source: unknown 11792 1727096174.54985: variable 'ansible_shell_type' from source: unknown 11792 1727096174.54987: variable 'ansible_shell_executable' from source: unknown 11792 1727096174.54989: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096174.54994: variable 'ansible_pipelining' from source: unknown 11792 1727096174.54996: variable 'ansible_timeout' from source: unknown 11792 1727096174.55000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096174.55104: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096174.55113: variable 'omit' from source: magic vars 11792 1727096174.55116: starting attempt loop 11792 1727096174.55119: running the handler 11792 1727096174.55128: _low_level_execute_command(): starting 11792 1727096174.55131: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096174.55771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096174.55780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096174.55790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.55803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096174.55838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.55846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096174.55922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.56000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.56003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.57706: stdout chunk (state=3): >>>/root <<< 11792 1727096174.57873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.57877: stdout chunk (state=3): >>><<< 11792 1727096174.57879: stderr chunk (state=3): >>><<< 11792 1727096174.57897: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096174.57979: _low_level_execute_command(): starting 11792 1727096174.57982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596 `" && echo ansible-tmp-1727096174.579048-14428-260846489780596="` echo /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596 `" ) && sleep 0' 11792 1727096174.58675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096174.58791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.58861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.58922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.61230: stdout chunk (state=3): >>>ansible-tmp-1727096174.579048-14428-260846489780596=/root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596 <<< 11792 1727096174.61475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.61479: stdout chunk (state=3): >>><<< 11792 1727096174.61481: stderr chunk (state=3): >>><<< 11792 1727096174.61483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096174.579048-14428-260846489780596=/root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096174.61486: variable 'ansible_module_compression' from source: unknown 11792 1727096174.61488: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096174.61514: variable 'ansible_facts' from source: unknown 11792 1727096174.61595: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/AnsiballZ_command.py 11792 1727096174.61733: Sending initial data 11792 1727096174.61849: Sent initial data (155 bytes) 11792 1727096174.62443: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096174.62484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096174.62541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096174.62557: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.62677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.62708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.62816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.64501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096174.64805: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096174.64810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpieyfecce" to remote "/root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/AnsiballZ_command.py" <<< 11792 1727096174.64813: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpieyfecce /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/AnsiballZ_command.py <<< 11792 1727096174.65348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096174.65408: stderr chunk (state=3): >>><<< 11792 1727096174.65423: stdout chunk (state=3): >>><<< 11792 1727096174.65467: done transferring module to remote 11792 1727096174.65489: _low_level_execute_command(): starting 11792 1727096174.65499: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/ /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/AnsiballZ_command.py && sleep 0' 11792 1727096174.66102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096174.66116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096174.66126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096174.66192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096174.66195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.66198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.66360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096174.68385: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 11792 1727096174.68389: stdout chunk (state=3): >>><<< 11792 1727096174.68391: stderr chunk (state=3): >>><<< 11792 1727096174.68474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096174.68478: _low_level_execute_command(): starting 11792 1727096174.68480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/AnsiballZ_command.py && sleep 0' 11792 1727096174.69888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096174.70294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096174.70403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096174.70650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096175.87829: stdout chunk (state=3): >>> {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-23 08:56:14.872621", "end": "2024-09-23 08:56:15.877027", "delta": "0:00:01.004406", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, 
"stdin": null}}} <<< 11792 1727096175.89738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096175.89770: stderr chunk (state=3): >>><<< 11792 1727096175.89774: stdout chunk (state=3): >>><<< 11792 1727096175.89793: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-23 08:56:14.872621", "end": "2024-09-23 08:56:15.877027", "delta": "0:00:01.004406", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096175.89815: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096175.89821: _low_level_execute_command(): starting 11792 1727096175.89826: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096174.579048-14428-260846489780596/ > /dev/null 2>&1 && sleep 0' 11792 1727096175.90255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096175.90287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096175.90290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.90293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096175.90295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.90337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096175.90357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096175.90395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096175.92683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096175.92710: stderr chunk (state=3): >>><<< 11792 1727096175.92714: stdout chunk (state=3): >>><<< 11792 1727096175.92727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096175.92733: handler run complete 11792 1727096175.92748: Evaluated conditional (False): False 11792 1727096175.92860: variable 'bond_opt' from source: unknown 11792 1727096175.92863: variable 'result' from source: set_fact 11792 1727096175.92879: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096175.92888: attempt loop complete, returning result 11792 1727096175.92909: variable 'bond_opt' from source: unknown 11792 1727096175.92956: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'arp_interval', 'value': '60'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_interval", "value": "60" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_interval" ], "delta": "0:00:01.004406", "end": "2024-09-23 08:56:15.877027", "rc": 0, "start": "2024-09-23 08:56:14.872621" } STDOUT: 60 11792 1727096175.93090: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096175.93093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096175.93095: variable 'omit' from source: magic vars 11792 1727096175.93196: variable 'ansible_distribution_major_version' from source: facts 11792 1727096175.93199: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096175.93211: variable 'omit' from source: magic vars 11792 1727096175.93219: variable 'omit' from source: magic vars 11792 1727096175.93326: variable 'controller_device' from source: play vars 11792 1727096175.93330: variable 'bond_opt' from source: unknown 11792 1727096175.93344: variable 'omit' from source: magic vars 11792 1727096175.93360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096175.93374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096175.93377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096175.93388: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096175.93391: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096175.93393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096175.93446: Set connection var ansible_timeout to 10 11792 1727096175.93452: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096175.93461: Set connection var ansible_shell_executable to /bin/sh 11792 1727096175.93465: Set connection var ansible_pipelining to False 11792 1727096175.93469: Set connection var ansible_shell_type to sh 11792 1727096175.93472: Set connection var ansible_connection to ssh 11792 1727096175.93485: variable 'ansible_shell_executable' from source: unknown 11792 1727096175.93488: 
variable 'ansible_connection' from source: unknown 11792 1727096175.93490: variable 'ansible_module_compression' from source: unknown 11792 1727096175.93492: variable 'ansible_shell_type' from source: unknown 11792 1727096175.93495: variable 'ansible_shell_executable' from source: unknown 11792 1727096175.93497: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096175.93501: variable 'ansible_pipelining' from source: unknown 11792 1727096175.93504: variable 'ansible_timeout' from source: unknown 11792 1727096175.93508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096175.93577: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096175.93583: variable 'omit' from source: magic vars 11792 1727096175.93586: starting attempt loop 11792 1727096175.93589: running the handler 11792 1727096175.93595: _low_level_execute_command(): starting 11792 1727096175.93597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096175.94045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096175.94048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096175.94050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.94055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096175.94057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.94100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096175.94112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096175.94155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096175.95805: stdout chunk (state=3): >>>/root <<< 11792 1727096175.95909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096175.95933: stderr chunk (state=3): >>><<< 11792 1727096175.95936: stdout chunk (state=3): >>><<< 11792 1727096175.95950: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096175.95959: _low_level_execute_command(): starting 11792 1727096175.95965: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539 `" && echo ansible-tmp-1727096175.9595022-14428-181764583632539="` echo /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539 `" ) && sleep 0' 11792 1727096175.96423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096175.96426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096175.96429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.96431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096175.96433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.96490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096175.96493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096175.96500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096175.96534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096175.98518: stdout chunk (state=3): >>>ansible-tmp-1727096175.9595022-14428-181764583632539=/root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539 <<< 11792 1727096175.98618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096175.98649: stderr chunk (state=3): >>><<< 11792 1727096175.98655: stdout chunk (state=3): >>><<< 11792 1727096175.98671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096175.9595022-14428-181764583632539=/root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096175.98691: variable 'ansible_module_compression' from source: unknown 11792 1727096175.98725: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096175.98740: variable 'ansible_facts' from source: unknown 11792 1727096175.98785: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/AnsiballZ_command.py 11792 1727096175.98886: Sending initial data 11792 1727096175.98892: Sent initial data (156 bytes) 11792 1727096175.99325: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096175.99360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096175.99364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096175.99367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.99370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096175.99372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096175.99374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096175.99422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096175.99425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096175.99427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096175.99470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.01122: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096176.01146: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096176.01184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpk8johp05 /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/AnsiballZ_command.py <<< 11792 1727096176.01186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/AnsiballZ_command.py" <<< 11792 1727096176.01215: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpk8johp05" to remote "/root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/AnsiballZ_command.py" <<< 11792 1727096176.01218: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/AnsiballZ_command.py" <<< 11792 1727096176.01705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.01751: stderr chunk (state=3): >>><<< 11792 1727096176.01757: stdout chunk (state=3): >>><<< 11792 1727096176.01783: done transferring module to remote 11792 1727096176.01790: _low_level_execute_command(): starting 11792 1727096176.01795: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/ /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/AnsiballZ_command.py && sleep 0' 11792 1727096176.02234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.02242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096176.02260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.02264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.02277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.02336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 
1727096176.02343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.02345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.02435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.04458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.04489: stderr chunk (state=3): >>><<< 11792 1727096176.04492: stdout chunk (state=3): >>><<< 11792 1727096176.04506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.04509: _low_level_execute_command(): starting 11792 1727096176.04514: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/AnsiballZ_command.py && sleep 0' 11792 1727096176.04947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.04956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096176.04973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.04985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.05048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.05051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.05053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.05096: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11792 1727096176.21884: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-23 08:56:16.213622", "end": "2024-09-23 08:56:16.216978", "delta": "0:00:00.003356", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096176.24575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096176.24603: stderr chunk (state=3): >>><<< 11792 1727096176.24606: stdout chunk (state=3): >>><<< 11792 1727096176.24621: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-23 08:56:16.213622", "end": "2024-09-23 08:56:16.216978", "delta": "0:00:00.003356", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
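
The chunks above trace one full remote cycle per loop item: probe the remote home directory with 'echo ~', create an ansible-tmp directory, transfer AnsiballZ_command.py over sftp, chmod it, run it with /usr/bin/python3.12, then remove the tmp directory, after which the executor evaluates 'bond_opt.value in result.stdout' for that item. The task file itself is not in this transcript, so the following is only a plausible reconstruction of a task shape that would emit this output; the loop variable bond_opt, the play var controller_device, the item values, and the two conditionals are taken from the log, while the task name and the exact choice of until/changed_when are assumptions.

    # Hypothetical reconstruction -- the actual task file is not shown in this log.
    - name: Verify bond options exposed under /sys/class/net/<device>/bonding
      ansible.builtin.command: >-
        cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
      register: result
      changed_when: false                      # the results above report "changed": false
      until: bond_opt.value in result.stdout   # the per-item conditional evaluated in the log
      when: ansible_distribution_major_version != '6'
      loop:
        - { key: 'mode', value: 'active-backup' }
        - { key: 'arp_interval', value: '60' }
        - { key: 'arp_ip_target', value: '192.0.2.128' }
      loop_control:
        loop_var: bond_opt

Because the check is a substring test, the sysfs output "active-backup 1" for the mode file still satisfies the expected value "active-backup".
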
11792 1727096176.24649: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_ip_target', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096176.24652: _low_level_execute_command(): starting 11792 1727096176.24660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096175.9595022-14428-181764583632539/ > /dev/null 2>&1 && sleep 0' 11792 1727096176.25475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.25517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.25628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.27429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.27478: stderr chunk (state=3): >>><<< 11792 1727096176.27489: stdout chunk (state=3): >>><<< 11792 1727096176.27514: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.27525: handler run complete 11792 1727096176.27548: Evaluated conditional (False): False 11792 1727096176.27685: variable 'bond_opt' from source: unknown 11792 1727096176.27774: variable 'result' from source: set_fact 11792 1727096176.27778: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096176.27780: attempt loop complete, returning result 11792 1727096176.27785: variable 'bond_opt' from source: unknown 11792 1727096176.27823: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'arp_ip_target', 'value': '192.0.2.128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_ip_target", "value": "192.0.2.128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_ip_target" ], "delta": "0:00:00.003356", "end": "2024-09-23 08:56:16.216978", "rc": 0, "start": "2024-09-23 08:56:16.213622" } STDOUT: 192.0.2.128 11792 1727096176.28185: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096176.28188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096176.28191: variable 'omit' from source: magic vars 11792 1727096176.28243: variable 'ansible_distribution_major_version' from source: facts 11792 1727096176.28254: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096176.28263: variable 'omit' from source: magic vars 11792 1727096176.28286: variable 'omit' from source: magic vars 11792 1727096176.28457: variable 'controller_device' from source: play vars 11792 1727096176.28470: variable 'bond_opt' from source: unknown 11792 1727096176.28494: variable 'omit' from source: magic vars 11792 1727096176.28525: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096176.28539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096176.28572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096176.28575: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096176.28578: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096176.28585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096176.28675: Set connection var ansible_timeout to 10 11792 1727096176.28743: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096176.28746: Set connection var ansible_shell_executable to /bin/sh 11792 1727096176.28749: Set connection var ansible_pipelining to False 11792 1727096176.28751: Set connection var ansible_shell_type to sh 11792 1727096176.28753: Set connection var ansible_connection to ssh 11792 1727096176.28755: variable 'ansible_shell_executable' from source: unknown 11792 1727096176.28757: variable 'ansible_connection' from source: unknown 11792 1727096176.28758: variable 'ansible_module_compression' from source: unknown 11792 1727096176.28760: variable 'ansible_shell_type' from source: unknown 11792 
1727096176.28762: variable 'ansible_shell_executable' from source: unknown 11792 1727096176.28764: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096176.28774: variable 'ansible_pipelining' from source: unknown 11792 1727096176.28782: variable 'ansible_timeout' from source: unknown 11792 1727096176.28790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096176.28899: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096176.28915: variable 'omit' from source: magic vars 11792 1727096176.28924: starting attempt loop 11792 1727096176.28961: running the handler 11792 1727096176.28964: _low_level_execute_command(): starting 11792 1727096176.28966: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096176.29584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096176.29599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.29688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.29716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.29733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.29837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.29890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.31972: stdout chunk (state=3): >>>/root <<< 11792 1727096176.32172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.32177: stdout chunk (state=3): >>><<< 11792 1727096176.32179: stderr chunk (state=3): >>><<< 11792 1727096176.32287: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.32290: _low_level_execute_command(): starting 11792 1727096176.32293: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004 `" && echo ansible-tmp-1727096176.322001-14428-107168134430004="` echo /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004 `" ) && sleep 0' 11792 1727096176.32855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096176.32891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.32974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096176.32993: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.33035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.33051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.33081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.33166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.35175: stdout chunk (state=3): >>>ansible-tmp-1727096176.322001-14428-107168134430004=/root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004 <<< 11792 1727096176.35337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.35341: stdout chunk (state=3): >>><<< 11792 1727096176.35343: stderr chunk (state=3): >>><<< 11792 1727096176.35359: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096176.322001-14428-107168134430004=/root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.35575: variable 'ansible_module_compression' from source: unknown 11792 1727096176.35579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096176.35581: variable 'ansible_facts' from source: unknown 11792 1727096176.35583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/AnsiballZ_command.py 11792 1727096176.35714: Sending initial data 11792 1727096176.35723: Sent initial data (155 bytes) 11792 1727096176.36313: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096176.36329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.36342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.36363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096176.36381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096176.36391: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096176.36472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.36486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.36500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.36520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.36598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.38461: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096176.38514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096176.38541: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/AnsiballZ_command.py" <<< 11792 1727096176.38545: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpe3b56xty /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/AnsiballZ_command.py <<< 11792 1727096176.38595: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpe3b56xty" to remote "/root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/AnsiballZ_command.py" <<< 11792 1727096176.39477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.39481: stdout chunk (state=3): >>><<< 11792 1727096176.39483: stderr chunk (state=3): >>><<< 11792 1727096176.39486: done transferring module to remote 11792 1727096176.39488: _low_level_execute_command(): starting 11792 1727096176.39490: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/ /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/AnsiballZ_command.py && sleep 0' 11792 1727096176.40153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.40199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.40215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.40237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.40314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.42471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.42475: stdout chunk (state=3): >>><<< 11792 1727096176.42485: stderr chunk (state=3): >>><<< 11792 1727096176.42504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.42507: _low_level_execute_command(): starting 11792 1727096176.42510: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/AnsiballZ_command.py && sleep 0' 11792 1727096176.43166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096176.43177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.43188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.43208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096176.43212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096176.43373: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096176.43376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.43384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.43387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.43389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.43442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.59908: stdout chunk (state=3): >>> {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-23 08:56:16.594472", "end": "2024-09-23 08:56:16.597640", "delta": "0:00:00.003168", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096176.61597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096176.61627: stderr chunk (state=3): >>><<< 11792 1727096176.61630: stdout chunk (state=3): >>><<< 11792 1727096176.61641: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-23 08:56:16.594472", "end": "2024-09-23 08:56:16.597640", "delta": "0:00:00.003168", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096176.61676: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_validate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096176.61679: _low_level_execute_command(): starting 11792 1727096176.61681: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096176.322001-14428-107168134430004/ > /dev/null 2>&1 && sleep 0' 11792 1727096176.62151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.62157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.62164: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.62169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.62229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.62236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.62253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.62311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.64502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.64530: stderr chunk (state=3): >>><<< 11792 1727096176.64533: stdout chunk (state=3): >>><<< 11792 1727096176.64549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.64555: handler run complete 11792 1727096176.64572: Evaluated conditional (False): False 11792 1727096176.64681: variable 'bond_opt' from source: unknown 11792 1727096176.64685: variable 'result' from source: set_fact 11792 1727096176.64696: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096176.64705: attempt loop complete, returning result 11792 1727096176.64719: variable 'bond_opt' from source: unknown 11792 1727096176.64770: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'arp_validate', 'value': 'none'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_validate", "value": "none" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_validate" ], "delta": "0:00:00.003168", "end": "2024-09-23 08:56:16.597640", "rc": 0, "start": "2024-09-23 08:56:16.594472" } STDOUT: none 0 11792 1727096176.64898: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096176.64903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096176.64905: variable 'omit' from source: magic vars 11792 1727096176.64988: variable 'ansible_distribution_major_version' from source: facts 11792 1727096176.64992: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096176.64995: variable 'omit' from source: magic vars 11792 1727096176.65007: variable 'omit' from source: magic vars 11792 1727096176.65114: variable 'controller_device' from source: play vars 11792 1727096176.65118: variable 'bond_opt' from source: unknown 11792 1727096176.65134: variable 'omit' from source: magic vars 11792 1727096176.65151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096176.65158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096176.65164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096176.65176: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096176.65179: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096176.65181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096176.65227: Set connection var ansible_timeout to 10 11792 1727096176.65235: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096176.65244: Set connection var ansible_shell_executable to /bin/sh 11792 1727096176.65247: Set connection var ansible_pipelining to False 11792 1727096176.65249: Set connection var ansible_shell_type to sh 11792 1727096176.65251: Set connection var ansible_connection to ssh 11792 1727096176.65268: variable 'ansible_shell_executable' from source: unknown 11792 1727096176.65271: 
variable 'ansible_connection' from source: unknown 11792 1727096176.65273: variable 'ansible_module_compression' from source: unknown 11792 1727096176.65276: variable 'ansible_shell_type' from source: unknown 11792 1727096176.65278: variable 'ansible_shell_executable' from source: unknown 11792 1727096176.65280: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096176.65284: variable 'ansible_pipelining' from source: unknown 11792 1727096176.65287: variable 'ansible_timeout' from source: unknown 11792 1727096176.65291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096176.65355: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096176.65359: variable 'omit' from source: magic vars 11792 1727096176.65361: starting attempt loop 11792 1727096176.65364: running the handler 11792 1727096176.65373: _low_level_execute_command(): starting 11792 1727096176.65376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096176.66051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096176.66080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.66103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.66177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.67890: stdout chunk (state=3): >>>/root <<< 11792 1727096176.67981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.68011: stderr chunk (state=3): >>><<< 11792 1727096176.68014: stdout chunk (state=3): >>><<< 11792 1727096176.68031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.68041: _low_level_execute_command(): starting 11792 1727096176.68046: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831 `" && echo ansible-tmp-1727096176.6803126-14428-156776745678831="` echo /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831 `" ) && sleep 0' 11792 1727096176.68518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.68521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096176.68524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.68526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.68528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.68585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.68589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.68595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.68632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.70637: stdout chunk (state=3): >>>ansible-tmp-1727096176.6803126-14428-156776745678831=/root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831 <<< 11792 1727096176.70741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.70781: stderr chunk (state=3): >>><<< 11792 1727096176.70785: stdout chunk (state=3): >>><<< 11792 1727096176.70799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096176.6803126-14428-156776745678831=/root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.70817: variable 'ansible_module_compression' from source: unknown 11792 1727096176.70846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096176.70869: variable 'ansible_facts' from source: unknown 11792 1727096176.70912: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/AnsiballZ_command.py 11792 1727096176.71007: Sending initial data 11792 1727096176.71010: Sent initial data (156 bytes) 11792 1727096176.71441: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.71473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096176.71476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096176.71478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.71480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.71482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.71529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.71532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.71541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.71591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.73253: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096176.73286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096176.73316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp5vcb_1l7 /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/AnsiballZ_command.py <<< 11792 1727096176.73327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/AnsiballZ_command.py" <<< 11792 1727096176.73347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp5vcb_1l7" to remote "/root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/AnsiballZ_command.py" <<< 11792 1727096176.73357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/AnsiballZ_command.py" <<< 11792 1727096176.73837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.73884: stderr chunk (state=3): >>><<< 11792 1727096176.73887: stdout chunk (state=3): >>><<< 11792 1727096176.73924: done transferring module to remote 11792 1727096176.73932: _low_level_execute_command(): starting 11792 1727096176.73937: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/ /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/AnsiballZ_command.py && sleep 0' 11792 1727096176.74402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.74406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.74408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.74410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096176.74412: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.74465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.74479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.74490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.74514: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11792 1727096176.76426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.76441: stderr chunk (state=3): >>><<< 11792 1727096176.76444: stdout chunk (state=3): >>><<< 11792 1727096176.76460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.76463: _low_level_execute_command(): starting 11792 1727096176.76469: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/AnsiballZ_command.py && sleep 0' 11792 1727096176.76943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096176.76947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096176.76950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096176.76952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.77007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.77010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.77013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.77063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.93908: stdout chunk (state=3): >>> {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-23 08:56:16.934200", "end": "2024-09-23 
08:56:16.937486", "delta": "0:00:00.003286", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096176.95675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096176.95680: stdout chunk (state=3): >>><<< 11792 1727096176.95682: stderr chunk (state=3): >>><<< 11792 1727096176.95812: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-23 08:56:16.934200", "end": "2024-09-23 08:56:16.937486", "delta": "0:00:00.003286", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096176.95821: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/primary', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096176.95825: _low_level_execute_command(): starting 11792 1727096176.95827: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096176.6803126-14428-156776745678831/ > /dev/null 2>&1 && sleep 0' 11792 1727096176.96259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096176.96266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.96287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096176.96290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096176.96345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096176.96349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096176.96354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096176.96392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096176.98381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096176.98385: stdout chunk (state=3): >>><<< 11792 1727096176.98387: stderr chunk (state=3): >>><<< 11792 1727096176.98409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096176.98440: handler run complete 11792 1727096176.98472: Evaluated conditional (False): False 11792 1727096176.98613: variable 'bond_opt' from source: unknown 11792 1727096176.98616: variable 'result' from source: set_fact 11792 1727096176.98627: Evaluated conditional (bond_opt.value in result.stdout): True 11792 1727096176.98641: attempt loop complete, returning result 11792 1727096176.98660: variable 'bond_opt' from source: unknown 11792 1727096176.98712: variable 'bond_opt' from source: unknown ok: [managed_node2] => (item={'key': 'primary', 'value': 'test1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "primary", "value": "test1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/primary" ], "delta": "0:00:00.003286", "end": "2024-09-23 08:56:16.937486", "rc": 0, "start": "2024-09-23 08:56:16.934200" } STDOUT: test1 11792 1727096176.98840: dumping result to json 11792 1727096176.98843: done dumping result, returning 11792 1727096176.98846: done running TaskExecutor() for managed_node2/TASK: ** TEST check bond settings [0afff68d-5257-d9c7-3fc0-000000000c2a] 11792 1727096176.98848: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000c2a 11792 1727096176.98905: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000c2a 11792 1727096176.98908: WORKER PROCESS EXITING 11792 1727096176.99033: no more pending results, returning what we have 11792 1727096176.99037: results queue empty 11792 1727096176.99038: checking for any_errors_fatal 11792 1727096176.99040: done checking for any_errors_fatal 11792 1727096176.99040: checking for max_fail_percentage 11792 1727096176.99042: done checking for max_fail_percentage 11792 1727096176.99042: checking to see if all hosts have failed and the running result is not ok 11792 1727096176.99043: done checking to see if all hosts have failed 11792 1727096176.99044: getting the remaining hosts for this loop 11792 1727096176.99045: done getting the remaining hosts for this loop 11792 1727096176.99048: getting the next task for host managed_node2 11792 1727096176.99055: done getting next task for host managed_node2 11792 1727096176.99057: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 11792 1727096176.99060: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096176.99064: getting variables 11792 1727096176.99065: in VariableManager get_vars() 11792 1727096176.99113: Calling all_inventory to load vars for managed_node2 11792 1727096176.99116: Calling groups_inventory to load vars for managed_node2 11792 1727096176.99118: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096176.99128: Calling all_plugins_play to load vars for managed_node2 11792 1727096176.99130: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096176.99133: Calling groups_plugins_play to load vars for managed_node2 11792 1727096177.00035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096177.01279: done with get_vars() 11792 1727096177.01313: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Monday 23 September 2024 08:56:17 -0400 (0:00:02.909) 0:00:59.293 ****** 11792 1727096177.01426: entering _queue_task() for managed_node2/include_tasks 11792 1727096177.01808: worker is 1 (out of 1 available) 11792 1727096177.01822: exiting _queue_task() for managed_node2/include_tasks 11792 1727096177.01836: done queuing things up, now waiting for results queue to drain 11792 1727096177.01837: waiting for pending results... 11792 1727096177.02397: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' 11792 1727096177.02403: in run() - task 0afff68d-5257-d9c7-3fc0-000000000c2c 11792 1727096177.02407: variable 'ansible_search_path' from source: unknown 11792 1727096177.02409: variable 'ansible_search_path' from source: unknown 11792 1727096177.02412: calling self._execute() 11792 1727096177.02602: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.02606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.02609: variable 'omit' from source: magic vars 11792 1727096177.02903: variable 'ansible_distribution_major_version' from source: facts 11792 1727096177.02924: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096177.02937: _execute() done 11792 1727096177.02945: dumping result to json 11792 1727096177.02955: done dumping result, returning 11792 1727096177.02966: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv4_present.yml' [0afff68d-5257-d9c7-3fc0-000000000c2c] 11792 1727096177.02979: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000c2c 11792 1727096177.03227: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000c2c 11792 1727096177.03230: WORKER PROCESS EXITING 11792 1727096177.03263: no more pending results, returning what we have 11792 1727096177.03271: in VariableManager get_vars() 11792 1727096177.03327: Calling all_inventory to load vars for managed_node2 11792 1727096177.03330: Calling groups_inventory to load vars for managed_node2 11792 1727096177.03333: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096177.03348: Calling all_plugins_play to load vars for managed_node2 11792 1727096177.03352: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096177.03358: Calling groups_plugins_play to load 
vars for managed_node2 11792 1727096177.04921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096177.06714: done with get_vars() 11792 1727096177.06740: variable 'ansible_search_path' from source: unknown 11792 1727096177.06742: variable 'ansible_search_path' from source: unknown 11792 1727096177.06756: variable 'item' from source: include params 11792 1727096177.06875: variable 'item' from source: include params 11792 1727096177.06912: we have included files to process 11792 1727096177.06914: generating all_blocks data 11792 1727096177.06915: done generating all_blocks data 11792 1727096177.06922: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11792 1727096177.06923: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11792 1727096177.06926: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11792 1727096177.07123: done processing included file 11792 1727096177.07125: iterating over new_blocks loaded from include file 11792 1727096177.07126: in VariableManager get_vars() 11792 1727096177.07151: done with get_vars() 11792 1727096177.07155: filtering new block on tags 11792 1727096177.07185: done filtering new block on tags 11792 1727096177.07188: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node2 11792 1727096177.07193: extending task lists for all hosts with included blocks 11792 1727096177.07406: done extending task lists 11792 1727096177.07408: done processing included files 11792 1727096177.07409: results queue empty 11792 1727096177.07409: checking for any_errors_fatal 11792 1727096177.07418: done checking for any_errors_fatal 11792 1727096177.07419: checking for max_fail_percentage 11792 1727096177.07420: done checking for max_fail_percentage 11792 1727096177.07421: checking to see if all hosts have failed and the running result is not ok 11792 1727096177.07421: done checking to see if all hosts have failed 11792 1727096177.07422: getting the remaining hosts for this loop 11792 1727096177.07423: done getting the remaining hosts for this loop 11792 1727096177.07426: getting the next task for host managed_node2 11792 1727096177.07431: done getting next task for host managed_node2 11792 1727096177.07433: ^ task is: TASK: ** TEST check IPv4 11792 1727096177.07436: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096177.07439: getting variables 11792 1727096177.07440: in VariableManager get_vars() 11792 1727096177.07457: Calling all_inventory to load vars for managed_node2 11792 1727096177.07460: Calling groups_inventory to load vars for managed_node2 11792 1727096177.07462: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096177.07470: Calling all_plugins_play to load vars for managed_node2 11792 1727096177.07472: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096177.07475: Calling groups_plugins_play to load vars for managed_node2 11792 1727096177.08693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096177.10308: done with get_vars() 11792 1727096177.10340: done getting variables 11792 1727096177.10392: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Monday 23 September 2024 08:56:17 -0400 (0:00:00.089) 0:00:59.383 ****** 11792 1727096177.10427: entering _queue_task() for managed_node2/command 11792 1727096177.10809: worker is 1 (out of 1 available) 11792 1727096177.10823: exiting _queue_task() for managed_node2/command 11792 1727096177.10837: done queuing things up, now waiting for results queue to drain 11792 1727096177.10839: waiting for pending results... 
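For orientation: the module arguments, registered variable, and conditionals evaluated in the entries that follow imply a task shaped roughly like the sketch below. This is a minimal reconstruction inferred only from this log (the actual assert_IPv4_present.yml is not reproduced here); the retries/delay values and the exact variable wiring are assumptions.

    # Hypothetical sketch of the "** TEST check IPv4" task (not the real file).
    - name: "** TEST check IPv4"
      command: ip -4 a s {{ interface }}   # log shows cmd ["ip", "-4", "a", "s", "nm-bond"]
      register: result                     # later read back as 'result' (source: set_fact)
      until: address in result.stdout      # conditional the log evaluates to True
      retries: 20                          # assumption
      delay: 2                             # assumption
      changed_when: false                  # assumption, consistent with the reported "changed": false
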
11792 1727096177.11143: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 11792 1727096177.11475: in run() - task 0afff68d-5257-d9c7-3fc0-000000000da6 11792 1727096177.11479: variable 'ansible_search_path' from source: unknown 11792 1727096177.11482: variable 'ansible_search_path' from source: unknown 11792 1727096177.11485: calling self._execute() 11792 1727096177.11487: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.11490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.11492: variable 'omit' from source: magic vars 11792 1727096177.11871: variable 'ansible_distribution_major_version' from source: facts 11792 1727096177.11889: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096177.11900: variable 'omit' from source: magic vars 11792 1727096177.11965: variable 'omit' from source: magic vars 11792 1727096177.12136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096177.14538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096177.14614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096177.14670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096177.14710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096177.14739: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096177.14883: variable 'interface' from source: include params 11792 1727096177.15074: variable 'controller_device' from source: play vars 11792 1727096177.15077: variable 'controller_device' from source: play vars 11792 1727096177.15080: variable 'omit' from source: magic vars 11792 1727096177.15373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096177.15377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096177.15396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096177.15425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096177.15442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096177.15545: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096177.15558: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.15566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.15842: Set connection var ansible_timeout to 10 11792 1727096177.15848: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096177.15870: Set connection var ansible_shell_executable to /bin/sh 11792 1727096177.15882: Set connection var ansible_pipelining to False 11792 1727096177.15889: Set connection var ansible_shell_type to sh 11792 1727096177.15895: Set connection var ansible_connection to ssh 11792 1727096177.15925: variable 'ansible_shell_executable' from source: unknown 11792 1727096177.15935: variable 
'ansible_connection' from source: unknown 11792 1727096177.15957: variable 'ansible_module_compression' from source: unknown 11792 1727096177.16021: variable 'ansible_shell_type' from source: unknown 11792 1727096177.16025: variable 'ansible_shell_executable' from source: unknown 11792 1727096177.16027: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.16029: variable 'ansible_pipelining' from source: unknown 11792 1727096177.16041: variable 'ansible_timeout' from source: unknown 11792 1727096177.16044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.16169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096177.16180: variable 'omit' from source: magic vars 11792 1727096177.16185: starting attempt loop 11792 1727096177.16189: running the handler 11792 1727096177.16206: _low_level_execute_command(): starting 11792 1727096177.16212: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096177.17276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.17304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.17319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.17392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.19107: stdout chunk (state=3): >>>/root <<< 11792 1727096177.19275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.19279: stdout chunk (state=3): >>><<< 11792 1727096177.19282: stderr chunk (state=3): >>><<< 11792 1727096177.19345: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096177.19349: _low_level_execute_command(): starting 11792 1727096177.19352: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708 `" && echo ansible-tmp-1727096177.1931162-14553-17989284200708="` echo /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708 `" ) && sleep 0' 11792 1727096177.20088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096177.20182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.20219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.20235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.20258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.20329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.22339: stdout chunk (state=3): >>>ansible-tmp-1727096177.1931162-14553-17989284200708=/root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708 <<< 11792 1727096177.22494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.22497: stdout chunk (state=3): >>><<< 11792 1727096177.22573: stderr chunk (state=3): >>><<< 11792 1727096177.22577: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096177.1931162-14553-17989284200708=/root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096177.22580: variable 'ansible_module_compression' from source: unknown 11792 1727096177.22635: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096177.22680: variable 'ansible_facts' from source: unknown 11792 1727096177.22782: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/AnsiballZ_command.py 11792 1727096177.22944: Sending initial data 11792 1727096177.22955: Sent initial data (155 bytes) 11792 1727096177.23549: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096177.23588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.23672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096177.23677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.23691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.23702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.23760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.25476: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096177.25481: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11792 1727096177.25484: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11792 1727096177.25487: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096177.25527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096177.25594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp2itsohhf /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/AnsiballZ_command.py <<< 11792 1727096177.25597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/AnsiballZ_command.py" <<< 11792 1727096177.25642: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp2itsohhf" to remote "/root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/AnsiballZ_command.py" <<< 11792 1727096177.26394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.26425: stdout chunk (state=3): >>><<< 11792 1727096177.26428: stderr chunk (state=3): >>><<< 11792 1727096177.26490: done transferring module to remote 11792 1727096177.26536: _low_level_execute_command(): starting 11792 1727096177.26540: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/ /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/AnsiballZ_command.py && sleep 0' 11792 1727096177.27131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.27137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096177.27161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096177.27165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.27231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.27234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.27236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.27283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.29204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.29208: stdout chunk (state=3): >>><<< 11792 1727096177.29210: stderr chunk (state=3): >>><<< 11792 1727096177.29229: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096177.29243: _low_level_execute_command(): starting 11792 1727096177.29321: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/AnsiballZ_command.py && sleep 0' 11792 1727096177.29829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.29851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.29855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.29908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.29916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.29977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.46646: stdout chunk (state=3): >>> <<< 11792 1727096177.46651: stdout chunk (state=3): >>>{"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.135/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 238sec preferred_lft 238sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:17.460865", "end": "2024-09-23 08:56:17.464715", "delta": "0:00:00.003850", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096177.48511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096177.48515: stdout chunk (state=3): >>><<< 11792 1727096177.48517: stderr chunk (state=3): >>><<< 11792 1727096177.48519: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.135/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 238sec preferred_lft 238sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:17.460865", "end": "2024-09-23 08:56:17.464715", "delta": "0:00:00.003850", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096177.48522: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096177.48529: _low_level_execute_command(): starting 11792 1727096177.48531: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096177.1931162-14553-17989284200708/ > /dev/null 2>&1 && sleep 0' 11792 1727096177.49063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.49111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.49124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.49180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.51156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.51160: stdout chunk (state=3): >>><<< 11792 1727096177.51162: stderr chunk (state=3): >>><<< 11792 1727096177.51378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096177.51382: handler run complete 11792 1727096177.51384: Evaluated conditional (False): False 11792 1727096177.51386: variable 'address' from source: include params 11792 1727096177.51388: variable 'result' from source: set_fact 11792 1727096177.51408: Evaluated conditional (address in result.stdout): True 11792 1727096177.51424: attempt loop complete, returning result 11792 1727096177.51431: _execute() done 11792 1727096177.51437: dumping result to json 11792 1727096177.51446: done dumping result, returning 11792 1727096177.51457: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 [0afff68d-5257-d9c7-3fc0-000000000da6] 11792 1727096177.51465: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000da6 ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003850", "end": "2024-09-23 08:56:17.464715", "rc": 0, "start": "2024-09-23 08:56:17.460865" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.135/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 238sec preferred_lft 238sec 11792 1727096177.51688: no more pending results, returning what we have 11792 1727096177.51693: results queue empty 11792 1727096177.51693: checking for any_errors_fatal 11792 1727096177.51695: done checking for any_errors_fatal 11792 1727096177.51696: checking for max_fail_percentage 11792 1727096177.51698: done checking for max_fail_percentage 11792 1727096177.51699: checking to see if all hosts have failed and the running result is not ok 11792 1727096177.51700: done checking to see if all hosts have failed 11792 1727096177.51700: getting the remaining hosts for this loop 11792 1727096177.51702: done getting the remaining hosts for this loop 11792 1727096177.51705: getting the next task for host managed_node2 11792 1727096177.51720: done getting next task for host managed_node2 11792 1727096177.51723: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 11792 1727096177.51726: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096177.51731: getting variables 11792 1727096177.51732: in VariableManager get_vars() 11792 1727096177.51838: Calling all_inventory to load vars for managed_node2 11792 1727096177.51841: Calling groups_inventory to load vars for managed_node2 11792 1727096177.51844: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096177.51946: Calling all_plugins_play to load vars for managed_node2 11792 1727096177.51950: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096177.51953: Calling groups_plugins_play to load vars for managed_node2 11792 1727096177.52557: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000da6 11792 1727096177.52561: WORKER PROCESS EXITING 11792 1727096177.53696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096177.55379: done with get_vars() 11792 1727096177.55414: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Monday 23 September 2024 08:56:17 -0400 (0:00:00.450) 0:00:59.834 ****** 11792 1727096177.55520: entering _queue_task() for managed_node2/include_tasks 11792 1727096177.55899: worker is 1 (out of 1 available) 11792 1727096177.55912: exiting _queue_task() for managed_node2/include_tasks 11792 1727096177.55925: done queuing things up, now waiting for results queue to drain 11792 1727096177.55926: waiting for pending results... 11792 1727096177.56225: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' 11792 1727096177.56348: in run() - task 0afff68d-5257-d9c7-3fc0-000000000c2d 11792 1727096177.56372: variable 'ansible_search_path' from source: unknown 11792 1727096177.56381: variable 'ansible_search_path' from source: unknown 11792 1727096177.56426: calling self._execute() 11792 1727096177.56539: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.56551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.56565: variable 'omit' from source: magic vars 11792 1727096177.56965: variable 'ansible_distribution_major_version' from source: facts 11792 1727096177.56983: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096177.57047: _execute() done 11792 1727096177.57050: dumping result to json 11792 1727096177.57053: done dumping result, returning 11792 1727096177.57055: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_IPv6_present.yml' [0afff68d-5257-d9c7-3fc0-000000000c2d] 11792 1727096177.57057: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000c2d 11792 1727096177.57131: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000c2d 11792 1727096177.57134: WORKER PROCESS EXITING 11792 1727096177.57180: no more pending results, returning what we have 11792 1727096177.57185: in VariableManager get_vars() 11792 1727096177.57238: Calling all_inventory to load vars for managed_node2 11792 1727096177.57241: Calling groups_inventory to load vars for managed_node2 11792 1727096177.57243: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096177.57275: Calling all_plugins_play to load vars for managed_node2 11792 1727096177.57279: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096177.57283: Calling 
groups_plugins_play to load vars for managed_node2 11792 1727096177.58942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096177.60692: done with get_vars() 11792 1727096177.60726: variable 'ansible_search_path' from source: unknown 11792 1727096177.60728: variable 'ansible_search_path' from source: unknown 11792 1727096177.60739: variable 'item' from source: include params 11792 1727096177.60847: variable 'item' from source: include params 11792 1727096177.60883: we have included files to process 11792 1727096177.60884: generating all_blocks data 11792 1727096177.60886: done generating all_blocks data 11792 1727096177.60891: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11792 1727096177.60892: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11792 1727096177.60894: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11792 1727096177.61086: done processing included file 11792 1727096177.61088: iterating over new_blocks loaded from include file 11792 1727096177.61089: in VariableManager get_vars() 11792 1727096177.61112: done with get_vars() 11792 1727096177.61114: filtering new block on tags 11792 1727096177.61141: done filtering new block on tags 11792 1727096177.61143: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node2 11792 1727096177.61149: extending task lists for all hosts with included blocks 11792 1727096177.61504: done extending task lists 11792 1727096177.61505: done processing included files 11792 1727096177.61506: results queue empty 11792 1727096177.61507: checking for any_errors_fatal 11792 1727096177.61512: done checking for any_errors_fatal 11792 1727096177.61513: checking for max_fail_percentage 11792 1727096177.61514: done checking for max_fail_percentage 11792 1727096177.61515: checking to see if all hosts have failed and the running result is not ok 11792 1727096177.61516: done checking to see if all hosts have failed 11792 1727096177.61517: getting the remaining hosts for this loop 11792 1727096177.61518: done getting the remaining hosts for this loop 11792 1727096177.61521: getting the next task for host managed_node2 11792 1727096177.61526: done getting next task for host managed_node2 11792 1727096177.61528: ^ task is: TASK: ** TEST check IPv6 11792 1727096177.61531: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096177.61534: getting variables 11792 1727096177.61535: in VariableManager get_vars() 11792 1727096177.61552: Calling all_inventory to load vars for managed_node2 11792 1727096177.61554: Calling groups_inventory to load vars for managed_node2 11792 1727096177.61556: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096177.61563: Calling all_plugins_play to load vars for managed_node2 11792 1727096177.61565: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096177.61570: Calling groups_plugins_play to load vars for managed_node2 11792 1727096177.62817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096177.64528: done with get_vars() 11792 1727096177.64560: done getting variables 11792 1727096177.64613: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Monday 23 September 2024 08:56:17 -0400 (0:00:00.091) 0:00:59.925 ****** 11792 1727096177.64653: entering _queue_task() for managed_node2/command 11792 1727096177.65032: worker is 1 (out of 1 available) 11792 1727096177.65044: exiting _queue_task() for managed_node2/command 11792 1727096177.65058: done queuing things up, now waiting for results queue to drain 11792 1727096177.65060: waiting for pending results... 
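The IPv6 check that follows mirrors the IPv4 task above. Only the command line is confirmed by the module arguments captured later in the log; the register/until pattern and the loop values are assumptions, and the assertion on the registered output falls outside this excerpt.

    # Hypothetical sketch of the "** TEST check IPv6" task (not the real file).
    - name: "** TEST check IPv6"
      command: ip -6 a s {{ controller_device }}   # log shows cmd ["ip", "-6", "a", "s", "nm-bond"]
      register: result                             # assumption: registered for a later assertion
      until: address in result.stdout              # assumption, mirroring the IPv4 check
      retries: 20                                  # assumption
      delay: 2                                     # assumption
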
11792 1727096177.65491: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 11792 1727096177.65499: in run() - task 0afff68d-5257-d9c7-3fc0-000000000dc7 11792 1727096177.65521: variable 'ansible_search_path' from source: unknown 11792 1727096177.65528: variable 'ansible_search_path' from source: unknown 11792 1727096177.65569: calling self._execute() 11792 1727096177.65678: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.65691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.65706: variable 'omit' from source: magic vars 11792 1727096177.66088: variable 'ansible_distribution_major_version' from source: facts 11792 1727096177.66107: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096177.66118: variable 'omit' from source: magic vars 11792 1727096177.66180: variable 'omit' from source: magic vars 11792 1727096177.66347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096177.68507: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096177.68643: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096177.68646: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096177.68674: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096177.68707: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096177.68803: variable 'controller_device' from source: play vars 11792 1727096177.68837: variable 'omit' from source: magic vars 11792 1727096177.68881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096177.68915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096177.68970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096177.68974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096177.68981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096177.69018: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096177.69029: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.69074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.69141: Set connection var ansible_timeout to 10 11792 1727096177.69156: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096177.69174: Set connection var ansible_shell_executable to /bin/sh 11792 1727096177.69188: Set connection var ansible_pipelining to False 11792 1727096177.69194: Set connection var ansible_shell_type to sh 11792 1727096177.69201: Set connection var ansible_connection to ssh 11792 1727096177.69226: variable 'ansible_shell_executable' from source: unknown 11792 1727096177.69290: variable 'ansible_connection' from source: unknown 11792 1727096177.69293: variable 'ansible_module_compression' from source: unknown 11792 1727096177.69296: variable 
'ansible_shell_type' from source: unknown 11792 1727096177.69298: variable 'ansible_shell_executable' from source: unknown 11792 1727096177.69300: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096177.69302: variable 'ansible_pipelining' from source: unknown 11792 1727096177.69304: variable 'ansible_timeout' from source: unknown 11792 1727096177.69305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096177.69377: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096177.69395: variable 'omit' from source: magic vars 11792 1727096177.69409: starting attempt loop 11792 1727096177.69415: running the handler 11792 1727096177.69438: _low_level_execute_command(): starting 11792 1727096177.69449: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096177.70063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.70067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096177.70072: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.70121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.70124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.70132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.70174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.71897: stdout chunk (state=3): >>>/root <<< 11792 1727096177.72045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.72049: stdout chunk (state=3): >>><<< 11792 1727096177.72051: stderr chunk (state=3): >>><<< 11792 1727096177.72178: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096177.72182: _low_level_execute_command(): starting 11792 1727096177.72185: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414 `" && echo ansible-tmp-1727096177.720805-14577-20991749492414="` echo /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414 `" ) && sleep 0' 11792 1727096177.72705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.72724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.72743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.72791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.72800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.72803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.72859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.74856: stdout chunk (state=3): >>>ansible-tmp-1727096177.720805-14577-20991749492414=/root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414 <<< 11792 1727096177.75077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.75081: stdout chunk (state=3): >>><<< 11792 1727096177.75084: stderr chunk (state=3): >>><<< 11792 1727096177.75086: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096177.720805-14577-20991749492414=/root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096177.75097: variable 'ansible_module_compression' from source: unknown 11792 1727096177.75152: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096177.75274: variable 'ansible_facts' from source: unknown 11792 1727096177.75306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/AnsiballZ_command.py 11792 1727096177.75559: Sending initial data 11792 1727096177.75562: Sent initial data (154 bytes) 11792 1727096177.76120: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.76126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096177.76156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.76159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096177.76162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096177.76164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.76220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096177.76228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.76232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.76287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.77937: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096177.77978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096177.78018: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp_6z64kx5 /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/AnsiballZ_command.py <<< 11792 1727096177.78021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/AnsiballZ_command.py" <<< 11792 1727096177.78070: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp_6z64kx5" to remote "/root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/AnsiballZ_command.py" <<< 11792 1727096177.78832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.78934: stdout chunk (state=3): >>><<< 11792 1727096177.78937: stderr chunk (state=3): >>><<< 11792 1727096177.78939: done transferring module to remote 11792 1727096177.78941: _low_level_execute_command(): starting 11792 1727096177.78943: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/ /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/AnsiballZ_command.py && sleep 0' 11792 1727096177.79538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.79568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.79630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.81503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096177.81529: stderr chunk (state=3): >>><<< 11792 1727096177.81533: stdout chunk (state=3): >>><<< 11792 1727096177.81553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096177.81557: _low_level_execute_command(): starting 11792 1727096177.81566: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/AnsiballZ_command.py && sleep 0' 11792 1727096177.82035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.82038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.82041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096177.82043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096177.82129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096177.82180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096177.82241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096177.99279: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::128/128 scope global dynamic noprefixroute \n valid_lft 239sec preferred_lft 239sec\n inet6 2001:db8::34e3:fdff:fe46:5fd7/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 fe80::34e3:fdff:fe46:5fd7/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:17.984210", "end": "2024-09-23 08:56:17.990348", "delta": "0:00:00.006138", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, 
"argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096178.01072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096178.01078: stdout chunk (state=3): >>><<< 11792 1727096178.01080: stderr chunk (state=3): >>><<< 11792 1727096178.01105: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::128/128 scope global dynamic noprefixroute \n valid_lft 239sec preferred_lft 239sec\n inet6 2001:db8::34e3:fdff:fe46:5fd7/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 fe80::34e3:fdff:fe46:5fd7/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:17.984210", "end": "2024-09-23 08:56:17.990348", "delta": "0:00:00.006138", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
11792 1727096178.01148: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096178.01241: _low_level_execute_command(): starting 11792 1727096178.01244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096177.720805-14577-20991749492414/ > /dev/null 2>&1 && sleep 0' 11792 1727096178.01805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096178.01829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096178.01848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096178.01871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096178.01888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096178.01899: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096178.01911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096178.01982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096178.02015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096178.02041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096178.02069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096178.02139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096178.04390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096178.04395: stdout chunk (state=3): >>><<< 11792 1727096178.04398: stderr chunk (state=3): >>><<< 11792 1727096178.04485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096178.04489: handler run complete 11792 1727096178.04492: Evaluated conditional (False): False 11792 1727096178.04614: variable 'address' from source: include params 11792 1727096178.04617: variable 'result' from source: set_fact 11792 1727096178.04635: Evaluated conditional (address in result.stdout): True 11792 1727096178.04647: attempt loop complete, returning result 11792 1727096178.04650: _execute() done 11792 1727096178.04655: dumping result to json 11792 1727096178.04657: done dumping result, returning 11792 1727096178.04666: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 [0afff68d-5257-d9c7-3fc0-000000000dc7] 11792 1727096178.04773: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000dc7 11792 1727096178.04842: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000dc7 11792 1727096178.04845: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.006138", "end": "2024-09-23 08:56:17.990348", "rc": 0, "start": "2024-09-23 08:56:17.984210" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::128/128 scope global dynamic noprefixroute valid_lft 239sec preferred_lft 239sec inet6 2001:db8::34e3:fdff:fe46:5fd7/64 scope global dynamic noprefixroute valid_lft 1797sec preferred_lft 1797sec inet6 fe80::34e3:fdff:fe46:5fd7/64 scope link noprefixroute valid_lft forever preferred_lft forever 11792 1727096178.04938: no more pending results, returning what we have 11792 1727096178.04942: results queue empty 11792 1727096178.04943: checking for any_errors_fatal 11792 1727096178.04945: done checking for any_errors_fatal 11792 1727096178.04945: checking for max_fail_percentage 11792 1727096178.04947: done checking for max_fail_percentage 11792 1727096178.04948: checking to see if all hosts have failed and the running result is not ok 11792 1727096178.04949: done checking to see if all hosts have failed 11792 1727096178.04950: getting the remaining hosts for this loop 11792 1727096178.04952: done getting the remaining hosts for this loop 11792 1727096178.04955: getting the next task for host managed_node2 11792 1727096178.04966: done getting next task for host managed_node2 11792 1727096178.04970: ^ task is: TASK: Conditional asserts 11792 1727096178.04973: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096178.04978: getting variables 11792 1727096178.04980: in VariableManager get_vars() 11792 1727096178.05140: Calling all_inventory to load vars for managed_node2 11792 1727096178.05143: Calling groups_inventory to load vars for managed_node2 11792 1727096178.05146: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.05157: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.05161: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.05164: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.06625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.08447: done with get_vars() 11792 1727096178.08474: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Monday 23 September 2024 08:56:18 -0400 (0:00:00.439) 0:01:00.364 ****** 11792 1727096178.08578: entering _queue_task() for managed_node2/include_tasks 11792 1727096178.09031: worker is 1 (out of 1 available) 11792 1727096178.09043: exiting _queue_task() for managed_node2/include_tasks 11792 1727096178.09055: done queuing things up, now waiting for results queue to drain 11792 1727096178.09057: waiting for pending results... 11792 1727096178.09585: running TaskExecutor() for managed_node2/TASK: Conditional asserts 11792 1727096178.09591: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008f0 11792 1727096178.09594: variable 'ansible_search_path' from source: unknown 11792 1727096178.09597: variable 'ansible_search_path' from source: unknown 11792 1727096178.09730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096178.12001: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096178.12085: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096178.12119: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096178.12160: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096178.12188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096178.12277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096178.12307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096178.12331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096178.12379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096178.12392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096178.12547: dumping result to json 11792 1727096178.12550: done dumping result, returning 11792 1727096178.12558: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0afff68d-5257-d9c7-3fc0-0000000008f0] 11792 1727096178.12576: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008f0 skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 11792 1727096178.12731: no more pending results, returning what we have 11792 1727096178.12736: results queue empty 11792 1727096178.12737: checking for any_errors_fatal 11792 1727096178.12745: done checking for any_errors_fatal 11792 1727096178.12746: checking for max_fail_percentage 11792 1727096178.12748: done checking for max_fail_percentage 11792 1727096178.12749: checking to see if all hosts have failed and the running result is not ok 11792 1727096178.12750: done checking to see if all hosts have failed 11792 1727096178.12750: getting the remaining hosts for this loop 11792 1727096178.12752: done getting the remaining hosts for this loop 11792 1727096178.12756: getting the next task for host managed_node2 11792 1727096178.12763: done getting next task for host managed_node2 11792 1727096178.12766: ^ task is: TASK: Success in test '{{ lsr_description }}' 11792 1727096178.12774: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096178.12779: getting variables 11792 1727096178.12781: in VariableManager get_vars() 11792 1727096178.12835: Calling all_inventory to load vars for managed_node2 11792 1727096178.12838: Calling groups_inventory to load vars for managed_node2 11792 1727096178.12841: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.12854: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.12858: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.12861: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.13588: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008f0 11792 1727096178.13593: WORKER PROCESS EXITING 11792 1727096178.14974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.16611: done with get_vars() 11792 1727096178.16646: done getting variables 11792 1727096178.16715: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096178.16842: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Monday 23 September 2024 08:56:18 -0400 (0:00:00.082) 0:01:00.447 ****** 11792 1727096178.16879: entering _queue_task() for managed_node2/debug 11792 1727096178.17263: worker is 1 (out of 1 available) 11792 1727096178.17477: exiting _queue_task() for managed_node2/debug 11792 1727096178.17488: done queuing things up, now waiting for results queue to drain 11792 1727096178.17490: waiting for pending results... 11792 1727096178.17621: running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
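
The entries that follow trace the debug action behind this banner (task path run_test.yml:47), with lsr_description supplied by the including play ("variable 'lsr_description' from source: include params"). A task of roughly this shape would emit the "+++++ Success in test ... +++++" message seen below; the exact msg template is inferred from the rendered output, not quoted from the task file:

    - name: "Success in test '{{ lsr_description }}'"
      debug:
        msg: "+++++ Success in test '{{ lsr_description }}' +++++"
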
11792 1727096178.17826: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008f1 11792 1727096178.17832: variable 'ansible_search_path' from source: unknown 11792 1727096178.17836: variable 'ansible_search_path' from source: unknown 11792 1727096178.17840: calling self._execute() 11792 1727096178.17912: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.17928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.17947: variable 'omit' from source: magic vars 11792 1727096178.18374: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.18378: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.18381: variable 'omit' from source: magic vars 11792 1727096178.18406: variable 'omit' from source: magic vars 11792 1727096178.18521: variable 'lsr_description' from source: include params 11792 1727096178.18549: variable 'omit' from source: magic vars 11792 1727096178.18608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096178.18652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096178.18699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096178.18712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096178.18808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096178.18811: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096178.18813: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.18816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.18897: Set connection var ansible_timeout to 10 11792 1727096178.18917: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096178.18932: Set connection var ansible_shell_executable to /bin/sh 11792 1727096178.18942: Set connection var ansible_pipelining to False 11792 1727096178.18949: Set connection var ansible_shell_type to sh 11792 1727096178.18958: Set connection var ansible_connection to ssh 11792 1727096178.18986: variable 'ansible_shell_executable' from source: unknown 11792 1727096178.18994: variable 'ansible_connection' from source: unknown 11792 1727096178.19002: variable 'ansible_module_compression' from source: unknown 11792 1727096178.19009: variable 'ansible_shell_type' from source: unknown 11792 1727096178.19016: variable 'ansible_shell_executable' from source: unknown 11792 1727096178.19027: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.19037: variable 'ansible_pipelining' from source: unknown 11792 1727096178.19043: variable 'ansible_timeout' from source: unknown 11792 1727096178.19052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.19244: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096178.19248: variable 'omit' from source: magic vars 11792 1727096178.19251: 
starting attempt loop 11792 1727096178.19256: running the handler 11792 1727096178.19302: handler run complete 11792 1727096178.19322: attempt loop complete, returning result 11792 1727096178.19355: _execute() done 11792 1727096178.19359: dumping result to json 11792 1727096178.19361: done dumping result, returning 11792 1727096178.19364: done running TaskExecutor() for managed_node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0afff68d-5257-d9c7-3fc0-0000000008f1] 11792 1727096178.19368: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008f1 11792 1727096178.19528: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008f1 11792 1727096178.19531: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 11792 1727096178.19620: no more pending results, returning what we have 11792 1727096178.19624: results queue empty 11792 1727096178.19625: checking for any_errors_fatal 11792 1727096178.19632: done checking for any_errors_fatal 11792 1727096178.19633: checking for max_fail_percentage 11792 1727096178.19635: done checking for max_fail_percentage 11792 1727096178.19636: checking to see if all hosts have failed and the running result is not ok 11792 1727096178.19637: done checking to see if all hosts have failed 11792 1727096178.19638: getting the remaining hosts for this loop 11792 1727096178.19639: done getting the remaining hosts for this loop 11792 1727096178.19644: getting the next task for host managed_node2 11792 1727096178.19652: done getting next task for host managed_node2 11792 1727096178.19658: ^ task is: TASK: Cleanup 11792 1727096178.19662: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096178.19670: getting variables 11792 1727096178.19671: in VariableManager get_vars() 11792 1727096178.19719: Calling all_inventory to load vars for managed_node2 11792 1727096178.19721: Calling groups_inventory to load vars for managed_node2 11792 1727096178.19724: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.19736: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.19739: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.19743: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.21532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.23135: done with get_vars() 11792 1727096178.23166: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Monday 23 September 2024 08:56:18 -0400 (0:00:00.063) 0:01:00.511 ****** 11792 1727096178.23266: entering _queue_task() for managed_node2/include_tasks 11792 1727096178.23644: worker is 1 (out of 1 available) 11792 1727096178.23658: exiting _queue_task() for managed_node2/include_tasks 11792 1727096178.23875: done queuing things up, now waiting for results queue to drain 11792 1727096178.23877: waiting for pending results... 11792 1727096178.24009: running TaskExecutor() for managed_node2/TASK: Cleanup 11792 1727096178.24174: in run() - task 0afff68d-5257-d9c7-3fc0-0000000008f5 11792 1727096178.24178: variable 'ansible_search_path' from source: unknown 11792 1727096178.24181: variable 'ansible_search_path' from source: unknown 11792 1727096178.24199: variable 'lsr_cleanup' from source: include params 11792 1727096178.24427: variable 'lsr_cleanup' from source: include params 11792 1727096178.24519: variable 'omit' from source: magic vars 11792 1727096178.24681: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.24698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.24763: variable 'omit' from source: magic vars 11792 1727096178.24994: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.25010: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.25021: variable 'item' from source: unknown 11792 1727096178.25096: variable 'item' from source: unknown 11792 1727096178.25132: variable 'item' from source: unknown 11792 1727096178.25204: variable 'item' from source: unknown 11792 1727096178.25683: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.25687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.25689: variable 'omit' from source: magic vars 11792 1727096178.25691: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.25693: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.25696: variable 'item' from source: unknown 11792 1727096178.25698: variable 'item' from source: unknown 11792 1727096178.25708: variable 'item' from source: unknown 11792 1727096178.25774: variable 'item' from source: unknown 11792 1727096178.25908: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.25921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 
1727096178.25935: variable 'omit' from source: magic vars 11792 1727096178.26094: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.26106: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.26120: variable 'item' from source: unknown 11792 1727096178.26226: variable 'item' from source: unknown 11792 1727096178.26230: variable 'item' from source: unknown 11792 1727096178.26287: variable 'item' from source: unknown 11792 1727096178.26493: dumping result to json 11792 1727096178.26495: done dumping result, returning 11792 1727096178.26498: done running TaskExecutor() for managed_node2/TASK: Cleanup [0afff68d-5257-d9c7-3fc0-0000000008f5] 11792 1727096178.26500: sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008f5 11792 1727096178.26534: done sending task result for task 0afff68d-5257-d9c7-3fc0-0000000008f5 11792 1727096178.26537: WORKER PROCESS EXITING 11792 1727096178.26567: no more pending results, returning what we have 11792 1727096178.26574: in VariableManager get_vars() 11792 1727096178.26623: Calling all_inventory to load vars for managed_node2 11792 1727096178.26626: Calling groups_inventory to load vars for managed_node2 11792 1727096178.26628: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.26644: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.26646: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.26649: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.28212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.29751: done with get_vars() 11792 1727096178.29777: variable 'ansible_search_path' from source: unknown 11792 1727096178.29779: variable 'ansible_search_path' from source: unknown 11792 1727096178.29820: variable 'ansible_search_path' from source: unknown 11792 1727096178.29821: variable 'ansible_search_path' from source: unknown 11792 1727096178.29852: variable 'ansible_search_path' from source: unknown 11792 1727096178.29856: variable 'ansible_search_path' from source: unknown 11792 1727096178.29886: we have included files to process 11792 1727096178.29887: generating all_blocks data 11792 1727096178.29890: done generating all_blocks data 11792 1727096178.29894: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11792 1727096178.29895: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11792 1727096178.29898: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11792 1727096178.30051: in VariableManager get_vars() 11792 1727096178.30079: done with get_vars() 11792 1727096178.30085: variable 'omit' from source: magic vars 11792 1727096178.30128: variable 'omit' from source: magic vars 11792 1727096178.30187: in VariableManager get_vars() 11792 1727096178.30205: done with get_vars() 11792 1727096178.30230: in VariableManager get_vars() 11792 1727096178.30251: done with get_vars() 11792 1727096178.30290: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11792 1727096178.30390: Loading data from 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11792 1727096178.30514: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11792 1727096178.30887: in VariableManager get_vars() 11792 1727096178.30910: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096178.32874: done processing included file 11792 1727096178.32876: iterating over new_blocks loaded from include file 11792 1727096178.32878: in VariableManager get_vars() 11792 1727096178.32899: done with get_vars() 11792 1727096178.32901: filtering new block on tags 11792 1727096178.33311: done filtering new block on tags 11792 1727096178.33315: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node2 => (item=tasks/cleanup_bond_profile+device.yml) 11792 1727096178.33320: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11792 1727096178.33321: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11792 1727096178.33324: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11792 1727096178.33663: done processing included file 11792 1727096178.33665: iterating over new_blocks loaded from include file 11792 1727096178.33666: in VariableManager get_vars() 11792 1727096178.33688: done with get_vars() 11792 1727096178.33690: filtering new block on tags 11792 1727096178.33719: done filtering new block on tags 11792 1727096178.33721: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node2 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 11792 1727096178.38713: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11792 1727096178.38721: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11792 1727096178.38725: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11792 1727096178.39092: done processing included file 11792 1727096178.39094: iterating over new_blocks loaded from include file 11792 1727096178.39095: in VariableManager get_vars() 11792 1727096178.39118: done with get_vars() 11792 1727096178.39120: filtering new block on tags 11792 1727096178.39152: done filtering new block on tags 11792 1727096178.39157: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 => (item=tasks/check_network_dns.yml) 11792 1727096178.39162: extending task lists for all hosts with included blocks 11792 1727096178.42408: done extending task lists 11792 1727096178.42410: done processing included files 11792 1727096178.42411: results queue empty 11792 1727096178.42412: 
checking for any_errors_fatal 11792 1727096178.42415: done checking for any_errors_fatal 11792 1727096178.42416: checking for max_fail_percentage 11792 1727096178.42417: done checking for max_fail_percentage 11792 1727096178.42418: checking to see if all hosts have failed and the running result is not ok 11792 1727096178.42419: done checking to see if all hosts have failed 11792 1727096178.42420: getting the remaining hosts for this loop 11792 1727096178.42421: done getting the remaining hosts for this loop 11792 1727096178.42423: getting the next task for host managed_node2 11792 1727096178.42428: done getting next task for host managed_node2 11792 1727096178.42430: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096178.42434: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096178.42445: getting variables 11792 1727096178.42446: in VariableManager get_vars() 11792 1727096178.42471: Calling all_inventory to load vars for managed_node2 11792 1727096178.42474: Calling groups_inventory to load vars for managed_node2 11792 1727096178.42476: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.42482: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.42484: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.42487: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.43650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.45189: done with get_vars() 11792 1727096178.45217: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:18 -0400 (0:00:00.220) 0:01:00.732 ****** 11792 1727096178.45308: entering _queue_task() for managed_node2/include_tasks 11792 1727096178.45691: worker is 1 (out of 1 available) 11792 1727096178.45703: exiting _queue_task() for managed_node2/include_tasks 11792 1727096178.45717: done queuing things up, now waiting for results queue to drain 11792 1727096178.45719: waiting for pending results... 
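
The next task, fedora.linux_system_roles.network : Ensure ansible_facts used by role, is an include of set_facts.yml; further down the log the guarded setup task is skipped because every required fact is already cached ("Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False"). A sketch of that guard, assuming a minimal gather_subset (the actual subset is not shown in this excerpt):

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min    # assumed; the log only shows that a setup action was queued
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
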
11792 1727096178.45991: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11792 1727096178.46143: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e0a 11792 1727096178.46160: variable 'ansible_search_path' from source: unknown 11792 1727096178.46164: variable 'ansible_search_path' from source: unknown 11792 1727096178.46202: calling self._execute() 11792 1727096178.46311: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.46324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.46332: variable 'omit' from source: magic vars 11792 1727096178.46746: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.46845: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.46848: _execute() done 11792 1727096178.46850: dumping result to json 11792 1727096178.46852: done dumping result, returning 11792 1727096178.46857: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-d9c7-3fc0-000000000e0a] 11792 1727096178.46860: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0a 11792 1727096178.46924: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0a 11792 1727096178.46927: WORKER PROCESS EXITING 11792 1727096178.46997: no more pending results, returning what we have 11792 1727096178.47002: in VariableManager get_vars() 11792 1727096178.47053: Calling all_inventory to load vars for managed_node2 11792 1727096178.47058: Calling groups_inventory to load vars for managed_node2 11792 1727096178.47060: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.47074: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.47076: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.47079: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.48474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.50105: done with get_vars() 11792 1727096178.50129: variable 'ansible_search_path' from source: unknown 11792 1727096178.50131: variable 'ansible_search_path' from source: unknown 11792 1727096178.50177: we have included files to process 11792 1727096178.50179: generating all_blocks data 11792 1727096178.50180: done generating all_blocks data 11792 1727096178.50182: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096178.50183: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096178.50185: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11792 1727096178.50761: done processing included file 11792 1727096178.50763: iterating over new_blocks loaded from include file 11792 1727096178.50765: in VariableManager get_vars() 11792 1727096178.50798: done with get_vars() 11792 1727096178.50800: filtering new block on tags 11792 1727096178.50832: done filtering new block on tags 11792 1727096178.50836: in VariableManager get_vars() 11792 1727096178.50869: done with get_vars() 11792 1727096178.50871: filtering new block on tags 11792 1727096178.50921: done filtering new block on tags 11792 1727096178.50923: in 
VariableManager get_vars() 11792 1727096178.50951: done with get_vars() 11792 1727096178.50952: filtering new block on tags 11792 1727096178.51002: done filtering new block on tags 11792 1727096178.51005: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11792 1727096178.51011: extending task lists for all hosts with included blocks 11792 1727096178.53463: done extending task lists 11792 1727096178.53465: done processing included files 11792 1727096178.53466: results queue empty 11792 1727096178.53466: checking for any_errors_fatal 11792 1727096178.53473: done checking for any_errors_fatal 11792 1727096178.53473: checking for max_fail_percentage 11792 1727096178.53475: done checking for max_fail_percentage 11792 1727096178.53475: checking to see if all hosts have failed and the running result is not ok 11792 1727096178.53476: done checking to see if all hosts have failed 11792 1727096178.53477: getting the remaining hosts for this loop 11792 1727096178.53478: done getting the remaining hosts for this loop 11792 1727096178.53481: getting the next task for host managed_node2 11792 1727096178.53486: done getting next task for host managed_node2 11792 1727096178.53489: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096178.53493: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096178.53506: getting variables 11792 1727096178.53508: in VariableManager get_vars() 11792 1727096178.53530: Calling all_inventory to load vars for managed_node2 11792 1727096178.53532: Calling groups_inventory to load vars for managed_node2 11792 1727096178.53534: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.53540: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.53542: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.53545: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.54880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.56735: done with get_vars() 11792 1727096178.56762: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:18 -0400 (0:00:00.115) 0:01:00.847 ****** 11792 1727096178.56859: entering _queue_task() for managed_node2/setup 11792 1727096178.57251: worker is 1 (out of 1 available) 11792 1727096178.57373: exiting _queue_task() for managed_node2/setup 11792 1727096178.57385: done queuing things up, now waiting for results queue to drain 11792 1727096178.57387: waiting for pending results... 11792 1727096178.57625: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11792 1727096178.57811: in run() - task 0afff68d-5257-d9c7-3fc0-000000000fde 11792 1727096178.57814: variable 'ansible_search_path' from source: unknown 11792 1727096178.57817: variable 'ansible_search_path' from source: unknown 11792 1727096178.57853: calling self._execute() 11792 1727096178.58027: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.58030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.58034: variable 'omit' from source: magic vars 11792 1727096178.58396: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.58413: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.58637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096178.61550: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096178.61639: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096178.61690: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096178.61731: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096178.61767: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096178.61901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096178.61905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11792 1727096178.61925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096178.61973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096178.61993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096178.62052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096178.62085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096178.62120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096178.62226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096178.62229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096178.62360: variable '__network_required_facts' from source: role '' defaults 11792 1727096178.62377: variable 'ansible_facts' from source: unknown 11792 1727096178.63138: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11792 1727096178.63147: when evaluation is False, skipping this task 11792 1727096178.63158: _execute() done 11792 1727096178.63167: dumping result to json 11792 1727096178.63177: done dumping result, returning 11792 1727096178.63190: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-d9c7-3fc0-000000000fde] 11792 1727096178.63273: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fde 11792 1727096178.63573: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fde 11792 1727096178.63577: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096178.63615: no more pending results, returning what we have 11792 1727096178.63618: results queue empty 11792 1727096178.63619: checking for any_errors_fatal 11792 1727096178.63620: done checking for any_errors_fatal 11792 1727096178.63621: checking for max_fail_percentage 11792 1727096178.63623: done checking for max_fail_percentage 11792 1727096178.63624: checking to see if all hosts have failed and the running result is not ok 11792 1727096178.63624: done checking to see if all hosts have failed 11792 1727096178.63625: getting the remaining hosts for this loop 11792 1727096178.63626: done getting the remaining hosts for 
this loop 11792 1727096178.63630: getting the next task for host managed_node2 11792 1727096178.63641: done getting next task for host managed_node2 11792 1727096178.63645: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096178.63651: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096178.63675: getting variables 11792 1727096178.63677: in VariableManager get_vars() 11792 1727096178.63724: Calling all_inventory to load vars for managed_node2 11792 1727096178.63727: Calling groups_inventory to load vars for managed_node2 11792 1727096178.63730: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.63740: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.63744: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.63753: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.65577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.68851: done with get_vars() 11792 1727096178.69087: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:18 -0400 (0:00:00.123) 0:01:00.971 ****** 11792 1727096178.69197: entering _queue_task() for managed_node2/stat 11792 1727096178.69966: worker is 1 (out of 1 available) 11792 1727096178.69982: exiting _queue_task() for managed_node2/stat 11792 1727096178.69997: done queuing things up, now waiting for results queue to drain 11792 1727096178.69998: waiting for pending results... 
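
The ostree check that follows is a stat task guarded so it only runs when the flag has not already been set; here it is skipped ("false_condition": "not __network_is_ostree is defined"). A sketch under the assumption that the marker file checked is /run/ostree-booted and that the result is registered under a hypothetical name (neither detail is visible in this excerpt):

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # assumed marker path; not shown in this log excerpt
      register: __ostree_booted_stat    # hypothetical variable name, for illustration only
      when: not __network_is_ostree is defined
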
11792 1727096178.70462: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11792 1727096178.70804: in run() - task 0afff68d-5257-d9c7-3fc0-000000000fe0 11792 1727096178.70817: variable 'ansible_search_path' from source: unknown 11792 1727096178.70821: variable 'ansible_search_path' from source: unknown 11792 1727096178.70861: calling self._execute() 11792 1727096178.70961: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.70968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.71183: variable 'omit' from source: magic vars 11792 1727096178.71973: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.71987: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.72152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096178.72873: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096178.72881: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096178.72902: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096178.72937: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096178.73449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096178.73453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096178.73459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096178.73463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096178.73465: variable '__network_is_ostree' from source: set_fact 11792 1727096178.73470: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096178.73473: when evaluation is False, skipping this task 11792 1727096178.73475: _execute() done 11792 1727096178.73479: dumping result to json 11792 1727096178.73481: done dumping result, returning 11792 1727096178.73484: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-d9c7-3fc0-000000000fe0] 11792 1727096178.73489: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe0 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096178.73656: no more pending results, returning what we have 11792 1727096178.73660: results queue empty 11792 1727096178.73661: checking for any_errors_fatal 11792 1727096178.73672: done checking for any_errors_fatal 11792 1727096178.73673: checking for max_fail_percentage 11792 1727096178.73675: done checking for max_fail_percentage 11792 1727096178.73676: checking to see if all hosts have 
failed and the running result is not ok 11792 1727096178.73677: done checking to see if all hosts have failed 11792 1727096178.73678: getting the remaining hosts for this loop 11792 1727096178.73681: done getting the remaining hosts for this loop 11792 1727096178.73685: getting the next task for host managed_node2 11792 1727096178.73695: done getting next task for host managed_node2 11792 1727096178.73700: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096178.73708: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096178.73731: getting variables 11792 1727096178.73732: in VariableManager get_vars() 11792 1727096178.73897: Calling all_inventory to load vars for managed_node2 11792 1727096178.73900: Calling groups_inventory to load vars for managed_node2 11792 1727096178.73902: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.73914: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.73917: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.73920: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.74584: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe0 11792 1727096178.74588: WORKER PROCESS EXITING 11792 1727096178.76196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.77836: done with get_vars() 11792 1727096178.77866: done getting variables 11792 1727096178.77938: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:18 -0400 (0:00:00.087) 0:01:01.059 ****** 11792 1727096178.77990: entering _queue_task() for managed_node2/set_fact 11792 1727096178.78422: worker is 1 (out of 1 available) 11792 1727096178.78434: exiting _queue_task() for managed_node2/set_fact 11792 1727096178.78448: done queuing things up, now waiting for results queue to drain 11792 1727096178.78450: waiting for pending results... 
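Both ostree-related tasks at set_facts.yml:12 and set_facts.yml:17 are guarded by not __network_is_ostree is defined, so once the flag has been set earlier in the run they are skipped, as the conditional evaluations in this log show. A rough sketch of that pattern, assuming the stat path and the registered variable name (neither appears in this log; only the task names, the stat/set_fact actions, and the when: condition are confirmed):

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted          # assumed path, not shown in the log
      register: __ostree_booted_stat      # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
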
11792 1727096178.78730: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11792 1727096178.78920: in run() - task 0afff68d-5257-d9c7-3fc0-000000000fe1 11792 1727096178.78951: variable 'ansible_search_path' from source: unknown 11792 1727096178.78969: variable 'ansible_search_path' from source: unknown 11792 1727096178.79019: calling self._execute() 11792 1727096178.79187: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.79191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.79194: variable 'omit' from source: magic vars 11792 1727096178.79602: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.79623: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.79803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096178.80168: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096178.80172: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096178.80199: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096178.80261: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096178.80384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096178.80403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096178.80434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096178.80467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096178.80575: variable '__network_is_ostree' from source: set_fact 11792 1727096178.80595: Evaluated conditional (not __network_is_ostree is defined): False 11792 1727096178.80611: when evaluation is False, skipping this task 11792 1727096178.80626: _execute() done 11792 1727096178.80635: dumping result to json 11792 1727096178.80643: done dumping result, returning 11792 1727096178.80703: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-d9c7-3fc0-000000000fe1] 11792 1727096178.80708: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe1 11792 1727096178.80780: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe1 11792 1727096178.80783: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11792 1727096178.80832: no more pending results, returning what we have 11792 1727096178.80836: results queue empty 11792 1727096178.80836: checking for any_errors_fatal 11792 1727096178.80843: done checking for any_errors_fatal 11792 
1727096178.80844: checking for max_fail_percentage 11792 1727096178.80846: done checking for max_fail_percentage 11792 1727096178.80847: checking to see if all hosts have failed and the running result is not ok 11792 1727096178.80848: done checking to see if all hosts have failed 11792 1727096178.80848: getting the remaining hosts for this loop 11792 1727096178.80850: done getting the remaining hosts for this loop 11792 1727096178.80853: getting the next task for host managed_node2 11792 1727096178.80868: done getting next task for host managed_node2 11792 1727096178.80873: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096178.80879: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096178.80902: getting variables 11792 1727096178.80905: in VariableManager get_vars() 11792 1727096178.80953: Calling all_inventory to load vars for managed_node2 11792 1727096178.80959: Calling groups_inventory to load vars for managed_node2 11792 1727096178.80961: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096178.81194: Calling all_plugins_play to load vars for managed_node2 11792 1727096178.81198: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096178.81202: Calling groups_plugins_play to load vars for managed_node2 11792 1727096178.82861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096178.84615: done with get_vars() 11792 1727096178.84644: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:18 -0400 (0:00:00.067) 0:01:01.126 ****** 11792 1727096178.84762: entering _queue_task() for managed_node2/service_facts 11792 1727096178.85295: worker is 1 (out of 1 available) 11792 1727096178.85305: exiting _queue_task() for managed_node2/service_facts 11792 1727096178.85316: done queuing things up, now waiting for results queue to drain 11792 1727096178.85318: waiting for pending results... 
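The service_facts task queued above (set_facts.yml:21) collects the state of the systemd units on the managed host, presumably so the role can decide which network-related services are in use. A minimal sketch of the task plus one illustrative way to consume its result; only the task name and the service_facts action are confirmed by the log, and the debug consumer below is hypothetical, not part of the role:

    - name: Check which services are running
      ansible.builtin.service_facts:

    # Illustrative consumer: ansible_facts.services has the same shape as the
    # module output printed later in this log (name/state/status/source per unit).
    - name: Report whether NetworkManager is running
      ansible.builtin.debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"
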
11792 1727096178.85671: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11792 1727096178.85766: in run() - task 0afff68d-5257-d9c7-3fc0-000000000fe3 11792 1727096178.85771: variable 'ansible_search_path' from source: unknown 11792 1727096178.85774: variable 'ansible_search_path' from source: unknown 11792 1727096178.85777: calling self._execute() 11792 1727096178.85878: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.85897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.85912: variable 'omit' from source: magic vars 11792 1727096178.86313: variable 'ansible_distribution_major_version' from source: facts 11792 1727096178.86329: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096178.86339: variable 'omit' from source: magic vars 11792 1727096178.86527: variable 'omit' from source: magic vars 11792 1727096178.86530: variable 'omit' from source: magic vars 11792 1727096178.86533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096178.86576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096178.86602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096178.86625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096178.86652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096178.86690: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096178.86700: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.86708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.86853: Set connection var ansible_timeout to 10 11792 1727096178.86859: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096178.86861: Set connection var ansible_shell_executable to /bin/sh 11792 1727096178.86864: Set connection var ansible_pipelining to False 11792 1727096178.86866: Set connection var ansible_shell_type to sh 11792 1727096178.86870: Set connection var ansible_connection to ssh 11792 1727096178.86894: variable 'ansible_shell_executable' from source: unknown 11792 1727096178.86901: variable 'ansible_connection' from source: unknown 11792 1727096178.86964: variable 'ansible_module_compression' from source: unknown 11792 1727096178.86969: variable 'ansible_shell_type' from source: unknown 11792 1727096178.86978: variable 'ansible_shell_executable' from source: unknown 11792 1727096178.86987: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096178.86990: variable 'ansible_pipelining' from source: unknown 11792 1727096178.86992: variable 'ansible_timeout' from source: unknown 11792 1727096178.86995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096178.87205: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096178.87220: variable 'omit' from source: magic vars 11792 
1727096178.87229: starting attempt loop 11792 1727096178.87236: running the handler 11792 1727096178.87252: _low_level_execute_command(): starting 11792 1727096178.87266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096178.88173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096178.88198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096178.88473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096178.89952: stdout chunk (state=3): >>>/root <<< 11792 1727096178.90292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096178.90296: stderr chunk (state=3): >>><<< 11792 1727096178.90298: stdout chunk (state=3): >>><<< 11792 1727096178.90301: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096178.90304: _low_level_execute_command(): starting 11792 1727096178.90307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184 `" && echo ansible-tmp-1727096178.9013007-14617-162837010133184="` echo /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184 `" ) && sleep 0' 11792 1727096178.91489: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096178.91936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096178.92046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096178.92052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096178.94015: stdout chunk (state=3): >>>ansible-tmp-1727096178.9013007-14617-162837010133184=/root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184 <<< 11792 1727096178.94107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096178.94137: stderr chunk (state=3): >>><<< 11792 1727096178.94147: stdout chunk (state=3): >>><<< 11792 1727096178.94177: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096178.9013007-14617-162837010133184=/root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096178.94281: variable 'ansible_module_compression' from source: unknown 11792 1727096178.94448: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11792 1727096178.94499: variable 'ansible_facts' from source: unknown 11792 1727096178.94814: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/AnsiballZ_service_facts.py 11792 
1727096178.95145: Sending initial data 11792 1727096178.95246: Sent initial data (162 bytes) 11792 1727096178.96590: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096178.96763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096178.96880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096178.96984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096178.98754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096178.98789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/AnsiballZ_service_facts.py" <<< 11792 1727096178.98845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpghxsb4gd /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/AnsiballZ_service_facts.py <<< 11792 1727096178.99090: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpghxsb4gd" to remote "/root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/AnsiballZ_service_facts.py" <<< 11792 1727096179.00554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096179.00625: stderr chunk (state=3): >>><<< 11792 1727096179.00635: stdout chunk (state=3): >>><<< 11792 1727096179.00762: done transferring module to remote 11792 1727096179.00785: _low_level_execute_command(): starting 11792 1727096179.00794: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/ /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/AnsiballZ_service_facts.py && sleep 0' 11792 1727096179.02043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096179.02206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096179.02423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096179.02465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096179.04483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096179.04487: stdout chunk (state=3): >>><<< 11792 1727096179.04490: stderr chunk (state=3): >>><<< 11792 1727096179.04665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096179.04676: _low_level_execute_command(): starting 11792 1727096179.04679: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/AnsiballZ_service_facts.py && sleep 0' 11792 1727096179.05826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096179.05830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096179.05833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096179.05835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096179.06111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096179.06191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096180.80100: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", 
"status": "enabled", "source":<<< 11792 1727096180.80177: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": 
{"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": 
{"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11792 1727096180.82287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096180.82291: stdout chunk (state=3): >>><<< 11792 1727096180.82293: stderr chunk (state=3): >>><<< 11792 1727096180.82297: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": 
{"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.15.126 closed. 11792 1727096180.83253: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096180.83257: _low_level_execute_command(): starting 11792 1727096180.83259: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096178.9013007-14617-162837010133184/ > /dev/null 2>&1 && sleep 0' 11792 1727096180.83853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096180.83948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096180.83952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096180.83955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096180.83957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096180.83959: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096180.83962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096180.83964: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096180.83966: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096180.83970: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096180.83972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096180.83974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096180.83976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096180.83978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096180.83980: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096180.83992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096180.84058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096180.84121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096180.84124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096180.84173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096180.86287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096180.86291: stdout chunk (state=3): >>><<< 11792 1727096180.86293: stderr chunk (state=3): >>><<< 11792 1727096180.86296: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096180.86298: handler run complete 11792 1727096180.86432: variable 'ansible_facts' from source: unknown 11792 1727096180.86593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096180.87239: variable 'ansible_facts' from source: unknown 11792 1727096180.87381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096180.87597: attempt loop complete, returning result 11792 1727096180.87608: _execute() done 11792 1727096180.87615: dumping result to json 11792 1727096180.87682: done dumping result, returning 11792 1727096180.87696: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-d9c7-3fc0-000000000fe3] 11792 1727096180.87704: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe3 11792 1727096180.89002: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe3 11792 1727096180.89006: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096180.89120: no more pending results, returning what we have 11792 1727096180.89125: results queue empty 11792 1727096180.89125: checking for any_errors_fatal 11792 1727096180.89129: done checking for any_errors_fatal 11792 1727096180.89130: checking for max_fail_percentage 11792 1727096180.89131: done checking for max_fail_percentage 11792 1727096180.89132: checking to see if all hosts have failed and the running result is not ok 11792 1727096180.89133: done checking to see if all hosts have failed 11792 1727096180.89134: getting the remaining hosts for this loop 11792 1727096180.89135: done getting the remaining hosts for this loop 11792 1727096180.89138: getting the next task for host managed_node2 11792 1727096180.89144: done getting next task for host managed_node2 11792 1727096180.89147: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096180.89155: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096180.89174: getting variables 11792 1727096180.89175: in VariableManager get_vars() 11792 1727096180.89208: Calling all_inventory to load vars for managed_node2 11792 1727096180.89211: Calling groups_inventory to load vars for managed_node2 11792 1727096180.89213: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096180.89222: Calling all_plugins_play to load vars for managed_node2 11792 1727096180.89224: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096180.89227: Calling groups_plugins_play to load vars for managed_node2 11792 1727096180.90470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096180.92253: done with get_vars() 11792 1727096180.92288: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:56:20 -0400 (0:00:02.078) 0:01:03.205 ****** 11792 1727096180.92586: entering _queue_task() for managed_node2/package_facts 11792 1727096180.93331: worker is 1 (out of 1 available) 11792 1727096180.93345: exiting _queue_task() for managed_node2/package_facts 11792 1727096180.93362: done queuing things up, now waiting for results queue to drain 11792 1727096180.93364: waiting for pending results... 
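The service_facts payload dumped above is a flat mapping of unit name to a small dict with "name", "state", "status", and "source" keys. As a minimal sketch of how such a payload can be consumed once captured (the excerpt below reuses values taken verbatim from the log; the variable names are illustrative, not part of the role), roughly the kind of "which services are running" check this task feeds into:

```python
import json

# Sketch only: filter a service_facts-style payload.
# `raw` stands in for the module's JSON stdout shown earlier in this log;
# the layout ("ansible_facts" -> "services" -> per-unit dicts with
# name/state/status/source) matches that output.
raw = '''
{"ansible_facts": {"services": {
  "sshd.service":      {"name": "sshd.service",      "state": "running",  "status": "enabled",  "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"},
  "auditd.service":    {"name": "auditd.service",    "state": "running",  "status": "enabled",  "source": "systemd"}
}}, "invocation": {"module_args": {}}}
'''

services = json.loads(raw)["ansible_facts"]["services"]

# Units currently running on the managed node.
running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['auditd.service', 'sshd.service']
```

Because the task above ran with no_log, the playbook output only shows the censored "ok" result; the full dictionary is visible only in this debug-level stdout reproduction.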
11792 1727096180.94180: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11792 1727096180.94321: in run() - task 0afff68d-5257-d9c7-3fc0-000000000fe4 11792 1727096180.94446: variable 'ansible_search_path' from source: unknown 11792 1727096180.94458: variable 'ansible_search_path' from source: unknown 11792 1727096180.94502: calling self._execute() 11792 1727096180.94625: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096180.94648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096180.94667: variable 'omit' from source: magic vars 11792 1727096180.95044: variable 'ansible_distribution_major_version' from source: facts 11792 1727096180.95054: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096180.95066: variable 'omit' from source: magic vars 11792 1727096180.95136: variable 'omit' from source: magic vars 11792 1727096180.95161: variable 'omit' from source: magic vars 11792 1727096180.95199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096180.95228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096180.95244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096180.95260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096180.95269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096180.95295: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096180.95299: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096180.95302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096180.95372: Set connection var ansible_timeout to 10 11792 1727096180.95379: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096180.95390: Set connection var ansible_shell_executable to /bin/sh 11792 1727096180.95394: Set connection var ansible_pipelining to False 11792 1727096180.95396: Set connection var ansible_shell_type to sh 11792 1727096180.95398: Set connection var ansible_connection to ssh 11792 1727096180.95416: variable 'ansible_shell_executable' from source: unknown 11792 1727096180.95419: variable 'ansible_connection' from source: unknown 11792 1727096180.95422: variable 'ansible_module_compression' from source: unknown 11792 1727096180.95424: variable 'ansible_shell_type' from source: unknown 11792 1727096180.95426: variable 'ansible_shell_executable' from source: unknown 11792 1727096180.95428: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096180.95431: variable 'ansible_pipelining' from source: unknown 11792 1727096180.95436: variable 'ansible_timeout' from source: unknown 11792 1727096180.95439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096180.95590: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096180.95599: variable 'omit' from source: magic vars 11792 
1727096180.95607: starting attempt loop 11792 1727096180.95610: running the handler 11792 1727096180.95622: _low_level_execute_command(): starting 11792 1727096180.95626: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096180.96175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096180.96180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096180.96184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096180.96297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096180.96320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096180.98028: stdout chunk (state=3): >>>/root <<< 11792 1727096180.98127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096180.98311: stderr chunk (state=3): >>><<< 11792 1727096180.98315: stdout chunk (state=3): >>><<< 11792 1727096180.98319: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096180.98321: _low_level_execute_command(): starting 11792 1727096180.98324: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064 `" && echo ansible-tmp-1727096180.9819498-14677-37582157586064="` echo 
/root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064 `" ) && sleep 0' 11792 1727096180.98873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096180.98883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096180.98887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096180.98890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096180.98896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096180.98899: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096180.98910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096180.98913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096180.98915: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096180.98918: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096180.98920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096180.98922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096180.98924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096180.98926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096180.98928: stderr chunk (state=3): >>>debug2: match found <<< 11792 1727096180.98980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096180.99033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096180.99113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096180.99131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096180.99352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096181.01407: stdout chunk (state=3): >>>ansible-tmp-1727096180.9819498-14677-37582157586064=/root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064 <<< 11792 1727096181.01676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096181.01680: stdout chunk (state=3): >>><<< 11792 1727096181.01682: stderr chunk (state=3): >>><<< 11792 1727096181.01685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096180.9819498-14677-37582157586064=/root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096181.01687: variable 'ansible_module_compression' from source: unknown 11792 1727096181.01710: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11792 1727096181.01783: variable 'ansible_facts' from source: unknown 11792 1727096181.01978: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/AnsiballZ_package_facts.py 11792 1727096181.02152: Sending initial data 11792 1727096181.02155: Sent initial data (161 bytes) 11792 1727096181.03034: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096181.04717: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11792 1727096181.04735: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11792 1727096181.04753: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11792 1727096181.04767: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11792 1727096181.04798: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096181.04983: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096181.05141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpsp15lv88 /root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/AnsiballZ_package_facts.py <<< 11792 1727096181.05145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/AnsiballZ_package_facts.py" <<< 11792 1727096181.05180: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpsp15lv88" to remote "/root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/AnsiballZ_package_facts.py" <<< 11792 1727096181.07221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096181.07227: stdout chunk (state=3): >>><<< 11792 1727096181.07229: stderr chunk (state=3): >>><<< 11792 1727096181.07231: done transferring module to remote 11792 1727096181.07234: _low_level_execute_command(): starting 11792 1727096181.07386: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/ /root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/AnsiballZ_package_facts.py && sleep 0' 11792 1727096181.09085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096181.09128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096181.11012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096181.11034: stderr chunk (state=3): >>><<< 11792 1727096181.11047: stdout chunk (state=3): >>><<< 11792 1727096181.11073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096181.11077: _low_level_execute_command(): starting 11792 1727096181.11079: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/AnsiballZ_package_facts.py && sleep 0' 11792 1727096181.11548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096181.11552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096181.11554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096181.11559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096181.11561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096181.11602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096181.11620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096181.11720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096181.58321: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11792 1727096181.58338: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": 
"10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11792 1727096181.58387: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11792 1727096181.58399: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": 
"lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11792 1727096181.58416: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": 
"0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11792 1727096181.58435: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": 
[{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 11792 1727096181.58487: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", 
"version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 11792 1727096181.58497: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 11792 1727096181.58506: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": 
"perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11792 1727096181.58526: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11792 1727096181.60492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096181.60515: stderr chunk (state=3): >>><<< 11792 1727096181.60518: stdout chunk (state=3): >>><<< 11792 1727096181.60588: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096181.61905: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096181.61923: _low_level_execute_command(): starting 11792 1727096181.61926: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096180.9819498-14677-37582157586064/ > /dev/null 2>&1 && sleep 0' 11792 1727096181.62412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096181.62416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096181.62419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096181.62421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096181.62472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096181.62476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096181.62478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096181.62526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096181.64429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096181.64454: stderr chunk (state=3): >>><<< 11792 1727096181.64457: stdout chunk (state=3): >>><<< 11792 1727096181.64477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096181.64486: handler run complete 11792 1727096181.64954: variable 'ansible_facts' from source: unknown 11792 1727096181.65434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096181.67913: variable 'ansible_facts' from source: unknown 11792 1727096181.68481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096181.69002: attempt loop complete, returning result 11792 1727096181.69006: _execute() done 11792 1727096181.69010: dumping result to json 11792 1727096181.69385: done dumping result, returning 11792 1727096181.69388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-d9c7-3fc0-000000000fe4] 11792 1727096181.69391: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe4 11792 1727096181.72445: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000fe4 11792 1727096181.72449: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096181.72623: no more pending results, returning what we have 11792 1727096181.72627: results queue empty 11792 1727096181.72628: checking for any_errors_fatal 11792 1727096181.72634: done checking for any_errors_fatal 11792 1727096181.72635: checking for max_fail_percentage 11792 1727096181.72636: done checking for max_fail_percentage 11792 1727096181.72637: checking to see if all hosts have failed and the running result is not ok 11792 1727096181.72638: done checking to see if all hosts have failed 11792 1727096181.72639: getting the remaining hosts for this loop 11792 1727096181.72640: done getting the remaining hosts for this loop 11792 1727096181.72643: getting the next task for host managed_node2 11792 1727096181.72651: done getting next task for host managed_node2 11792 1727096181.72655: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096181.72660: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096181.72675: getting variables 11792 1727096181.72676: in VariableManager get_vars() 11792 1727096181.72712: Calling all_inventory to load vars for managed_node2 11792 1727096181.72715: Calling groups_inventory to load vars for managed_node2 11792 1727096181.72717: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096181.72726: Calling all_plugins_play to load vars for managed_node2 11792 1727096181.72734: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096181.72737: Calling groups_plugins_play to load vars for managed_node2 11792 1727096181.75445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096181.77776: done with get_vars() 11792 1727096181.77810: done getting variables 11792 1727096181.77888: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:21 -0400 (0:00:00.853) 0:01:04.058 ****** 11792 1727096181.77932: entering _queue_task() for managed_node2/debug 11792 1727096181.78347: worker is 1 (out of 1 available) 11792 1727096181.78364: exiting _queue_task() for managed_node2/debug 11792 1727096181.78381: done queuing things up, now waiting for results queue to drain 11792 1727096181.78383: waiting for pending results... 
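For reference, a minimal sketch of how a package inventory like the JSON dump above can be gathered and queried. This is not the role's actual task file: the play wrapper and the firewalld lookup are illustrative assumptions, while ansible.builtin.package_facts (manager "auto", default strategy), the no_log setting that explains the "censored" result, and the ansible_facts.packages layout (name/version/release/epoch/arch/source per package) match what the log records.

- hosts: managed_node2
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto          # mirrors the logged module_args
      no_log: true             # why the task result above is shown as censored

    - name: Report the installed firewalld version, if present
      # 'firewalld' appears in the dump above; any other key is looked up the same way
      ansible.builtin.debug:
        msg: "{{ ansible_facts.packages['firewalld'][0].version }}"
      when: "'firewalld' in ansible_facts.packages"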
11792 1727096181.78964: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11792 1727096181.79166: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e0b 11792 1727096181.79212: variable 'ansible_search_path' from source: unknown 11792 1727096181.79300: variable 'ansible_search_path' from source: unknown 11792 1727096181.79326: calling self._execute() 11792 1727096181.79535: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096181.79546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096181.79677: variable 'omit' from source: magic vars 11792 1727096181.80355: variable 'ansible_distribution_major_version' from source: facts 11792 1727096181.80445: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096181.80458: variable 'omit' from source: magic vars 11792 1727096181.80576: variable 'omit' from source: magic vars 11792 1727096181.80695: variable 'network_provider' from source: set_fact 11792 1727096181.80730: variable 'omit' from source: magic vars 11792 1727096181.80778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096181.80823: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096181.80849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096181.80873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096181.80888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096181.80924: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096181.80938: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096181.80946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096181.81052: Set connection var ansible_timeout to 10 11792 1727096181.81066: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096181.81084: Set connection var ansible_shell_executable to /bin/sh 11792 1727096181.81095: Set connection var ansible_pipelining to False 11792 1727096181.81101: Set connection var ansible_shell_type to sh 11792 1727096181.81107: Set connection var ansible_connection to ssh 11792 1727096181.81147: variable 'ansible_shell_executable' from source: unknown 11792 1727096181.81151: variable 'ansible_connection' from source: unknown 11792 1727096181.81153: variable 'ansible_module_compression' from source: unknown 11792 1727096181.81156: variable 'ansible_shell_type' from source: unknown 11792 1727096181.81158: variable 'ansible_shell_executable' from source: unknown 11792 1727096181.81372: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096181.81377: variable 'ansible_pipelining' from source: unknown 11792 1727096181.81380: variable 'ansible_timeout' from source: unknown 11792 1727096181.81384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096181.81391: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11792 1727096181.81394: variable 'omit' from source: magic vars 11792 1727096181.81395: starting attempt loop 11792 1727096181.81397: running the handler 11792 1727096181.81399: handler run complete 11792 1727096181.81410: attempt loop complete, returning result 11792 1727096181.81416: _execute() done 11792 1727096181.81421: dumping result to json 11792 1727096181.81426: done dumping result, returning 11792 1727096181.81436: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-d9c7-3fc0-000000000e0b] 11792 1727096181.81443: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0b ok: [managed_node2] => {} MSG: Using network provider: nm 11792 1727096181.81595: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0b 11792 1727096181.81600: WORKER PROCESS EXITING 11792 1727096181.81617: no more pending results, returning what we have 11792 1727096181.81621: results queue empty 11792 1727096181.81622: checking for any_errors_fatal 11792 1727096181.81630: done checking for any_errors_fatal 11792 1727096181.81631: checking for max_fail_percentage 11792 1727096181.81632: done checking for max_fail_percentage 11792 1727096181.81633: checking to see if all hosts have failed and the running result is not ok 11792 1727096181.81634: done checking to see if all hosts have failed 11792 1727096181.81634: getting the remaining hosts for this loop 11792 1727096181.81636: done getting the remaining hosts for this loop 11792 1727096181.81639: getting the next task for host managed_node2 11792 1727096181.81646: done getting next task for host managed_node2 11792 1727096181.81650: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096181.81658: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096181.81673: getting variables 11792 1727096181.81674: in VariableManager get_vars() 11792 1727096181.81721: Calling all_inventory to load vars for managed_node2 11792 1727096181.81724: Calling groups_inventory to load vars for managed_node2 11792 1727096181.81726: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096181.81735: Calling all_plugins_play to load vars for managed_node2 11792 1727096181.81738: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096181.81741: Calling groups_plugins_play to load vars for managed_node2 11792 1727096181.83377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096181.85760: done with get_vars() 11792 1727096181.85796: done getting variables 11792 1727096181.85858: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:21 -0400 (0:00:00.079) 0:01:04.138 ****** 11792 1727096181.85904: entering _queue_task() for managed_node2/fail 11792 1727096181.86263: worker is 1 (out of 1 available) 11792 1727096181.86277: exiting _queue_task() for managed_node2/fail 11792 1727096181.86290: done queuing things up, now waiting for results queue to drain 11792 1727096181.86292: waiting for pending results... 
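The ok result above, with MSG "Using network provider: nm", comes from the role's "Print network provider" task at tasks/main.yml:7, which runs the debug action. A minimal sketch of a task with that behaviour; the exact YAML in the role may differ, and network_provider is the fact set earlier in the run (here it resolves to "nm").

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"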
11792 1727096181.86549: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11792 1727096181.86732: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e0c 11792 1727096181.86758: variable 'ansible_search_path' from source: unknown 11792 1727096181.86765: variable 'ansible_search_path' from source: unknown 11792 1727096181.86809: calling self._execute() 11792 1727096181.86902: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096181.86921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096181.86964: variable 'omit' from source: magic vars 11792 1727096181.87986: variable 'ansible_distribution_major_version' from source: facts 11792 1727096181.87990: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096181.88098: variable 'network_state' from source: role '' defaults 11792 1727096181.88116: Evaluated conditional (network_state != {}): False 11792 1727096181.88139: when evaluation is False, skipping this task 11792 1727096181.88156: _execute() done 11792 1727096181.88170: dumping result to json 11792 1727096181.88180: done dumping result, returning 11792 1727096181.88316: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-d9c7-3fc0-000000000e0c] 11792 1727096181.88320: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0c 11792 1727096181.88402: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0c 11792 1727096181.88406: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096181.88463: no more pending results, returning what we have 11792 1727096181.88470: results queue empty 11792 1727096181.88471: checking for any_errors_fatal 11792 1727096181.88481: done checking for any_errors_fatal 11792 1727096181.88481: checking for max_fail_percentage 11792 1727096181.88484: done checking for max_fail_percentage 11792 1727096181.88485: checking to see if all hosts have failed and the running result is not ok 11792 1727096181.88486: done checking to see if all hosts have failed 11792 1727096181.88487: getting the remaining hosts for this loop 11792 1727096181.88488: done getting the remaining hosts for this loop 11792 1727096181.88493: getting the next task for host managed_node2 11792 1727096181.88502: done getting next task for host managed_node2 11792 1727096181.88506: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096181.88512: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096181.88538: getting variables 11792 1727096181.88540: in VariableManager get_vars() 11792 1727096181.88596: Calling all_inventory to load vars for managed_node2 11792 1727096181.88599: Calling groups_inventory to load vars for managed_node2 11792 1727096181.88602: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096181.88616: Calling all_plugins_play to load vars for managed_node2 11792 1727096181.88619: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096181.88622: Calling groups_plugins_play to load vars for managed_node2 11792 1727096181.91111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096181.92947: done with get_vars() 11792 1727096181.92984: done getting variables 11792 1727096181.93046: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:21 -0400 (0:00:00.071) 0:01:04.210 ****** 11792 1727096181.93087: entering _queue_task() for managed_node2/fail 11792 1727096181.93451: worker is 1 (out of 1 available) 11792 1727096181.93464: exiting _queue_task() for managed_node2/fail 11792 1727096181.93583: done queuing things up, now waiting for results queue to drain 11792 1727096181.93585: waiting for pending results... 
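The skipping result above illustrates the role's guard pattern: a fail task that only fires when its when: conditions hold, and is reported with false_condition when they do not. A hedged sketch of a task with that shape; only the network_state != {} condition is visible in the log, so the provider check and the message text below are assumptions, not the role's exact source.

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: "The network_state variable cannot be applied with the initscripts provider"
  when:
    - network_state != {}                    # the condition that evaluated False above
    - network_provider == "initscripts"      # assumed second condition, not shown in the log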
11792 1727096181.93988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11792 1727096181.94009: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e0d 11792 1727096181.94035: variable 'ansible_search_path' from source: unknown 11792 1727096181.94053: variable 'ansible_search_path' from source: unknown 11792 1727096181.94104: calling self._execute() 11792 1727096181.94216: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096181.94229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096181.94244: variable 'omit' from source: magic vars 11792 1727096181.94639: variable 'ansible_distribution_major_version' from source: facts 11792 1727096181.94660: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096181.94802: variable 'network_state' from source: role '' defaults 11792 1727096181.94819: Evaluated conditional (network_state != {}): False 11792 1727096181.94828: when evaluation is False, skipping this task 11792 1727096181.94839: _execute() done 11792 1727096181.94850: dumping result to json 11792 1727096181.94873: done dumping result, returning 11792 1727096181.94878: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-d9c7-3fc0-000000000e0d] 11792 1727096181.94881: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0d 11792 1727096181.95032: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0d 11792 1727096181.95037: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096181.95114: no more pending results, returning what we have 11792 1727096181.95119: results queue empty 11792 1727096181.95120: checking for any_errors_fatal 11792 1727096181.95129: done checking for any_errors_fatal 11792 1727096181.95130: checking for max_fail_percentage 11792 1727096181.95132: done checking for max_fail_percentage 11792 1727096181.95133: checking to see if all hosts have failed and the running result is not ok 11792 1727096181.95134: done checking to see if all hosts have failed 11792 1727096181.95135: getting the remaining hosts for this loop 11792 1727096181.95136: done getting the remaining hosts for this loop 11792 1727096181.95141: getting the next task for host managed_node2 11792 1727096181.95150: done getting next task for host managed_node2 11792 1727096181.95154: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096181.95161: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096181.95189: getting variables 11792 1727096181.95191: in VariableManager get_vars() 11792 1727096181.95246: Calling all_inventory to load vars for managed_node2 11792 1727096181.95249: Calling groups_inventory to load vars for managed_node2 11792 1727096181.95252: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096181.95266: Calling all_plugins_play to load vars for managed_node2 11792 1727096181.95474: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096181.95479: Calling groups_plugins_play to load vars for managed_node2 11792 1727096181.97027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096181.98626: done with get_vars() 11792 1727096181.98661: done getting variables 11792 1727096181.98723: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:21 -0400 (0:00:00.056) 0:01:04.266 ****** 11792 1727096181.98760: entering _queue_task() for managed_node2/fail 11792 1727096181.99131: worker is 1 (out of 1 available) 11792 1727096181.99143: exiting _queue_task() for managed_node2/fail 11792 1727096181.99157: done queuing things up, now waiting for results queue to drain 11792 1727096181.99158: waiting for pending results... 
11792 1727096181.99588: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11792 1727096181.99624: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e0e 11792 1727096181.99646: variable 'ansible_search_path' from source: unknown 11792 1727096181.99653: variable 'ansible_search_path' from source: unknown 11792 1727096181.99699: calling self._execute() 11792 1727096181.99808: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096181.99821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096181.99835: variable 'omit' from source: magic vars 11792 1727096182.00205: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.00224: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096182.00413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096182.02745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096182.02941: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096182.02945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096182.02948: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096182.02950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096182.03019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.03054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.03090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.03133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.03152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.03259: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.03288: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11792 1727096182.03413: variable 'ansible_distribution' from source: facts 11792 1727096182.03423: variable '__network_rh_distros' from source: role '' defaults 11792 1727096182.03438: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11792 1727096182.03696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.03730: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.03758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.03803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.03824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.03876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.03902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.03972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.03975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.03992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.04039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.04067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.04096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.04135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.04157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.04502: variable 'network_connections' from source: task vars 11792 1727096182.04572: variable 'port2_profile' from source: play vars 11792 1727096182.04592: variable 'port2_profile' from source: play vars 11792 1727096182.04607: variable 'port1_profile' from source: play vars 11792 1727096182.04670: variable 'port1_profile' from source: play vars 11792 1727096182.04685: variable 'controller_profile' from source: play vars 
11792 1727096182.04748: variable 'controller_profile' from source: play vars 11792 1727096182.04763: variable 'network_state' from source: role '' defaults 11792 1727096182.04972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096182.05032: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096182.05076: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096182.05115: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096182.05147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096182.05212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096182.05238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096182.05270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.05303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096182.05337: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11792 1727096182.05345: when evaluation is False, skipping this task 11792 1727096182.05352: _execute() done 11792 1727096182.05359: dumping result to json 11792 1727096182.05366: done dumping result, returning 11792 1727096182.05383: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-d9c7-3fc0-000000000e0e] 11792 1727096182.05413: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0e skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11792 1727096182.05565: no more pending results, returning what we have 11792 1727096182.05571: results queue empty 11792 1727096182.05572: checking for any_errors_fatal 11792 1727096182.05580: done checking for any_errors_fatal 11792 1727096182.05581: checking for max_fail_percentage 11792 1727096182.05583: done checking for max_fail_percentage 11792 1727096182.05584: checking to see if all hosts have failed and the running result is not ok 11792 1727096182.05585: done checking to see if all hosts have failed 11792 1727096182.05585: getting the remaining hosts for this loop 11792 1727096182.05587: done getting the remaining hosts for this loop 11792 
1727096182.05592: getting the next task for host managed_node2 11792 1727096182.05601: done getting next task for host managed_node2 11792 1727096182.05605: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096182.05611: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096182.05632: getting variables 11792 1727096182.05634: in VariableManager get_vars() 11792 1727096182.05787: Calling all_inventory to load vars for managed_node2 11792 1727096182.05790: Calling groups_inventory to load vars for managed_node2 11792 1727096182.05792: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096182.05979: Calling all_plugins_play to load vars for managed_node2 11792 1727096182.05983: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096182.05986: Calling groups_plugins_play to load vars for managed_node2 11792 1727096182.06682: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0e 11792 1727096182.06686: WORKER PROCESS EXITING 11792 1727096182.07412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096182.09103: done with get_vars() 11792 1727096182.09127: done getting variables 11792 1727096182.09190: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:56:22 -0400 (0:00:00.104) 0:01:04.371 ****** 11792 1727096182.09226: entering _queue_task() for managed_node2/dnf 11792 1727096182.09597: worker is 1 (out of 1 available) 11792 1727096182.09611: exiting _queue_task() for managed_node2/dnf 11792 1727096182.09624: done queuing things up, now waiting for results queue to drain 11792 1727096182.09625: waiting for pending results... 
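
The skip recorded above preserves the exact guard on the EL10 teaming abort: neither network_connections nor network_state.interfaces contains an entry whose type matches ^team$, so the task never runs. A minimal sketch of a task built around that guard follows; the role's actual task body is not shown in this log, so the use of ansible.builtin.fail and the message text are assumptions, and the real task may carry additional conditions.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not available on this host  # assumed wording; the real message is not logged
  when: >-
    network_connections | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
    or network_state.get("interfaces", []) | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
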
11792 1727096182.09929: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11792 1727096182.10110: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e0f 11792 1727096182.10132: variable 'ansible_search_path' from source: unknown 11792 1727096182.10141: variable 'ansible_search_path' from source: unknown 11792 1727096182.10184: calling self._execute() 11792 1727096182.10289: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096182.10303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096182.10322: variable 'omit' from source: magic vars 11792 1727096182.10706: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.10751: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096182.10961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096182.15236: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096182.15319: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096182.15422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096182.15462: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096182.15500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096182.15589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.15628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.15661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.15829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.15834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.15947: variable 'ansible_distribution' from source: facts 11792 1727096182.15960: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.15995: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11792 1727096182.16122: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096182.16273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.16307: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.16339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.16389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.16407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.16573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.16576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.16579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.16581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.16583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.16615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.16642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.16672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.16723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.16749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.16927: variable 'network_connections' from source: task vars 11792 1727096182.16948: variable 'port2_profile' from source: play vars 11792 1727096182.17134: variable 'port2_profile' from source: play vars 11792 1727096182.17138: variable 'port1_profile' from source: play vars 11792 1727096182.17140: variable 'port1_profile' from source: play vars 11792 1727096182.17142: variable 'controller_profile' from source: play vars 
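
For the DNF update check queued above, the log records two of its guards evaluating True on this host: ansible_distribution_major_version != '6' and ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7; the wireless/team guard is evaluated a few entries further down. Below is a sketch of how such a when list combines (Ansible ANDs the list items); the dnf arguments are placeholders, since they are not visible in this log.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # placeholder; the real module arguments are not logged here
    state: latest
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
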
11792 1727096182.17183: variable 'controller_profile' from source: play vars 11792 1727096182.17263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096182.17483: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096182.17527: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096182.17566: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096182.17605: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096182.17654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096182.17731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096182.17834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.17865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096182.18373: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096182.18639: variable 'network_connections' from source: task vars 11792 1727096182.19073: variable 'port2_profile' from source: play vars 11792 1727096182.19076: variable 'port2_profile' from source: play vars 11792 1727096182.19078: variable 'port1_profile' from source: play vars 11792 1727096182.19080: variable 'port1_profile' from source: play vars 11792 1727096182.19082: variable 'controller_profile' from source: play vars 11792 1727096182.19084: variable 'controller_profile' from source: play vars 11792 1727096182.19248: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096182.19256: when evaluation is False, skipping this task 11792 1727096182.19263: _execute() done 11792 1727096182.19272: dumping result to json 11792 1727096182.19279: done dumping result, returning 11792 1727096182.19291: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000e0f] 11792 1727096182.19298: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0f skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096182.19530: no more pending results, returning what we have 11792 1727096182.19535: results queue empty 11792 1727096182.19536: checking for any_errors_fatal 11792 1727096182.19544: done checking for any_errors_fatal 11792 1727096182.19545: checking for max_fail_percentage 11792 1727096182.19547: done checking for max_fail_percentage 11792 1727096182.19548: checking to see if all hosts have failed and the running result is not ok 11792 
1727096182.19548: done checking to see if all hosts have failed 11792 1727096182.19549: getting the remaining hosts for this loop 11792 1727096182.19551: done getting the remaining hosts for this loop 11792 1727096182.19555: getting the next task for host managed_node2 11792 1727096182.19564: done getting next task for host managed_node2 11792 1727096182.19571: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096182.19577: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096182.19599: getting variables 11792 1727096182.19600: in VariableManager get_vars() 11792 1727096182.19648: Calling all_inventory to load vars for managed_node2 11792 1727096182.19651: Calling groups_inventory to load vars for managed_node2 11792 1727096182.19653: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096182.19664: Calling all_plugins_play to load vars for managed_node2 11792 1727096182.19669: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096182.19672: Calling groups_plugins_play to load vars for managed_node2 11792 1727096182.20484: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e0f 11792 1727096182.20488: WORKER PROCESS EXITING 11792 1727096182.21410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096182.23621: done with get_vars() 11792 1727096182.23660: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11792 1727096182.23739: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:56:22 -0400 (0:00:00.147) 0:01:04.518 ****** 11792 1727096182.23980: entering _queue_task() for managed_node2/yum 11792 1727096182.24811: worker is 1 
(out of 1 available) 11792 1727096182.24823: exiting _queue_task() for managed_node2/yum 11792 1727096182.24836: done queuing things up, now waiting for results queue to drain 11792 1727096182.24838: waiting for pending results... 11792 1727096182.25229: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11792 1727096182.25591: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e10 11792 1727096182.25622: variable 'ansible_search_path' from source: unknown 11792 1727096182.25631: variable 'ansible_search_path' from source: unknown 11792 1727096182.25679: calling self._execute() 11792 1727096182.25795: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096182.25814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096182.25836: variable 'omit' from source: magic vars 11792 1727096182.26281: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.26300: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096182.26579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096182.28934: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096182.29017: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096182.29065: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096182.29122: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096182.29154: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096182.29278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.29324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.29357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.29428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.29440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.29650: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.29653: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11792 1727096182.29659: when evaluation is False, skipping this task 11792 1727096182.29662: _execute() done 11792 1727096182.29665: dumping result to json 11792 1727096182.29669: done dumping result, returning 11792 1727096182.29673: done running TaskExecutor() for 
managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000e10] 11792 1727096182.29675: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e10 11792 1727096182.29753: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e10 11792 1727096182.29761: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11792 1727096182.29817: no more pending results, returning what we have 11792 1727096182.29821: results queue empty 11792 1727096182.29822: checking for any_errors_fatal 11792 1727096182.29828: done checking for any_errors_fatal 11792 1727096182.29829: checking for max_fail_percentage 11792 1727096182.29832: done checking for max_fail_percentage 11792 1727096182.29833: checking to see if all hosts have failed and the running result is not ok 11792 1727096182.29833: done checking to see if all hosts have failed 11792 1727096182.29834: getting the remaining hosts for this loop 11792 1727096182.29835: done getting the remaining hosts for this loop 11792 1727096182.29839: getting the next task for host managed_node2 11792 1727096182.29847: done getting next task for host managed_node2 11792 1727096182.29852: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096182.29856: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096182.29879: getting variables 11792 1727096182.29880: in VariableManager get_vars() 11792 1727096182.29928: Calling all_inventory to load vars for managed_node2 11792 1727096182.29931: Calling groups_inventory to load vars for managed_node2 11792 1727096182.29933: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096182.29943: Calling all_plugins_play to load vars for managed_node2 11792 1727096182.29946: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096182.29948: Calling groups_plugins_play to load vars for managed_node2 11792 1727096182.31993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096182.33753: done with get_vars() 11792 1727096182.33803: done getting variables 11792 1727096182.33872: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:56:22 -0400 (0:00:00.099) 0:01:04.618 ****** 11792 1727096182.33934: entering _queue_task() for managed_node2/fail 11792 1727096182.34459: worker is 1 (out of 1 available) 11792 1727096182.34474: exiting _queue_task() for managed_node2/fail 11792 1727096182.34486: done queuing things up, now waiting for results queue to drain 11792 1727096182.34488: waiting for pending results... 
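
The YUM-flavoured twin of that check is skipped above because ansible_distribution_major_version | int < 8 is False, and the log also notes ansible-core redirecting ansible.builtin.yum to the dnf action plugin. A sketch under the same assumptions as the previous one:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:  # on ansible-core 2.17 this action is redirected to ansible.builtin.dnf, as the log notes
    name: "{{ network_packages }}"  # placeholder arguments
    state: latest
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined  # assumed to mirror the DNF variant; not evaluated in this run
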
11792 1727096182.34932: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11792 1727096182.35029: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e11 11792 1727096182.35245: variable 'ansible_search_path' from source: unknown 11792 1727096182.35249: variable 'ansible_search_path' from source: unknown 11792 1727096182.35354: calling self._execute() 11792 1727096182.35675: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096182.35679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096182.35682: variable 'omit' from source: magic vars 11792 1727096182.36206: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.36340: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096182.36576: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096182.37001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096182.41702: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096182.41803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096182.41993: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096182.42033: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096182.42090: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096182.42248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.42492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.42495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.42497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.42587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.42646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.42735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.42772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.42860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.42946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.42998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.43064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.43178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.43222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.43276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.43644: variable 'network_connections' from source: task vars 11792 1727096182.43665: variable 'port2_profile' from source: play vars 11792 1727096182.43872: variable 'port2_profile' from source: play vars 11792 1727096182.43875: variable 'port1_profile' from source: play vars 11792 1727096182.44126: variable 'port1_profile' from source: play vars 11792 1727096182.44130: variable 'controller_profile' from source: play vars 11792 1727096182.44237: variable 'controller_profile' from source: play vars 11792 1727096182.44320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096182.44773: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096182.44829: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096182.44964: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096182.44999: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096182.45095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096182.45177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096182.45206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.45284: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096182.45337: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096182.45919: variable 'network_connections' from source: task vars 11792 1727096182.45931: variable 'port2_profile' from source: play vars 11792 1727096182.46087: variable 'port2_profile' from source: play vars 11792 1727096182.46100: variable 'port1_profile' from source: play vars 11792 1727096182.46342: variable 'port1_profile' from source: play vars 11792 1727096182.46345: variable 'controller_profile' from source: play vars 11792 1727096182.46380: variable 'controller_profile' from source: play vars 11792 1727096182.46494: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096182.46512: when evaluation is False, skipping this task 11792 1727096182.46519: _execute() done 11792 1727096182.46526: dumping result to json 11792 1727096182.46532: done dumping result, returning 11792 1727096182.46544: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000e11] 11792 1727096182.46552: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e11 11792 1727096182.46880: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e11 11792 1727096182.46883: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096182.46939: no more pending results, returning what we have 11792 1727096182.46943: results queue empty 11792 1727096182.46944: checking for any_errors_fatal 11792 1727096182.46953: done checking for any_errors_fatal 11792 1727096182.46953: checking for max_fail_percentage 11792 1727096182.46958: done checking for max_fail_percentage 11792 1727096182.46959: checking to see if all hosts have failed and the running result is not ok 11792 1727096182.46959: done checking to see if all hosts have failed 11792 1727096182.46960: getting the remaining hosts for this loop 11792 1727096182.46962: done getting the remaining hosts for this loop 11792 1727096182.46966: getting the next task for host managed_node2 11792 1727096182.46976: done getting next task for host managed_node2 11792 1727096182.46980: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11792 1727096182.46986: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096182.47007: getting variables 11792 1727096182.47009: in VariableManager get_vars() 11792 1727096182.47061: Calling all_inventory to load vars for managed_node2 11792 1727096182.47064: Calling groups_inventory to load vars for managed_node2 11792 1727096182.47067: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096182.47481: Calling all_plugins_play to load vars for managed_node2 11792 1727096182.47484: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096182.47487: Calling groups_plugins_play to load vars for managed_node2 11792 1727096182.50131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096182.53601: done with get_vars() 11792 1727096182.53638: done getting variables 11792 1727096182.53706: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:56:22 -0400 (0:00:00.198) 0:01:04.816 ****** 11792 1727096182.53744: entering _queue_task() for managed_node2/package 11792 1727096182.54541: worker is 1 (out of 1 available) 11792 1727096182.54559: exiting _queue_task() for managed_node2/package 11792 1727096182.54677: done queuing things up, now waiting for results queue to drain 11792 1727096182.54680: waiting for pending results... 
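
The consent-to-restart task above is skipped on the same pair of flags, __network_wireless_connections_defined and __network_team_connections_defined, which the log attributes to the role's defaults. Their definitions are not included in this log; the sketch below shows one plausible way such flags could be derived from network_connections, so the exact filter chain is an assumption rather than the role's real code.

# Hypothetical defaults entries; the role's real definitions are not shown in this log.
__network_wireless_connections_defined: >-
  {{ network_connections | selectattr("type", "defined")
     | selectattr("type", "match", "^wireless$") | list | length > 0 }}
__network_team_connections_defined: >-
  {{ network_connections | selectattr("type", "defined")
     | selectattr("type", "match", "^team$") | list | length > 0 }}
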
11792 1727096182.55096: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11792 1727096182.55571: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e12 11792 1727096182.55585: variable 'ansible_search_path' from source: unknown 11792 1727096182.55589: variable 'ansible_search_path' from source: unknown 11792 1727096182.55675: calling self._execute() 11792 1727096182.55845: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096182.55852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096182.55861: variable 'omit' from source: magic vars 11792 1727096182.57002: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.57074: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096182.57519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096182.58172: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096182.58325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096182.58360: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096182.58393: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096182.58703: variable 'network_packages' from source: role '' defaults 11792 1727096182.58814: variable '__network_provider_setup' from source: role '' defaults 11792 1727096182.58845: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096182.59096: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096182.59099: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096182.59174: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096182.59775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096182.64401: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096182.64480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096182.64507: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096182.75869: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096182.75975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096182.75979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.75999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.76025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.76071: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.76085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.76129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.76159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.76183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.76220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.76235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.76628: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096182.76632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.76635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.76642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.76682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.76700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.76792: variable 'ansible_python' from source: facts 11792 1727096182.76813: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096182.76896: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096182.77013: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096182.77375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.77379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11792 1727096182.77382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.77483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.77499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.77540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096182.77563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096182.77589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.77626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096182.77639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096182.77993: variable 'network_connections' from source: task vars 11792 1727096182.77999: variable 'port2_profile' from source: play vars 11792 1727096182.78305: variable 'port2_profile' from source: play vars 11792 1727096182.78316: variable 'port1_profile' from source: play vars 11792 1727096182.78424: variable 'port1_profile' from source: play vars 11792 1727096182.78435: variable 'controller_profile' from source: play vars 11792 1727096182.78740: variable 'controller_profile' from source: play vars 11792 1727096182.79024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096182.79053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096182.79088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096182.79118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096182.79162: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096182.79863: variable 'network_connections' from source: task vars 11792 1727096182.79869: variable 'port2_profile' from source: play vars 11792 1727096182.79977: variable 'port2_profile' from source: play vars 11792 
1727096182.79990: variable 'port1_profile' from source: play vars 11792 1727096182.80289: variable 'port1_profile' from source: play vars 11792 1727096182.80299: variable 'controller_profile' from source: play vars 11792 1727096182.80397: variable 'controller_profile' from source: play vars 11792 1727096182.80431: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096182.80712: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096182.81231: variable 'network_connections' from source: task vars 11792 1727096182.81235: variable 'port2_profile' from source: play vars 11792 1727096182.81508: variable 'port2_profile' from source: play vars 11792 1727096182.81516: variable 'port1_profile' from source: play vars 11792 1727096182.81579: variable 'port1_profile' from source: play vars 11792 1727096182.81587: variable 'controller_profile' from source: play vars 11792 1727096182.81647: variable 'controller_profile' from source: play vars 11792 1727096182.81876: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096182.81952: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096182.82447: variable 'network_connections' from source: task vars 11792 1727096182.82452: variable 'port2_profile' from source: play vars 11792 1727096182.82776: variable 'port2_profile' from source: play vars 11792 1727096182.82780: variable 'port1_profile' from source: play vars 11792 1727096182.82991: variable 'port1_profile' from source: play vars 11792 1727096182.82995: variable 'controller_profile' from source: play vars 11792 1727096182.83061: variable 'controller_profile' from source: play vars 11792 1727096182.83115: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096182.83175: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096182.83182: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096182.83238: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096182.83885: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096182.84986: variable 'network_connections' from source: task vars 11792 1727096182.84990: variable 'port2_profile' from source: play vars 11792 1727096182.84992: variable 'port2_profile' from source: play vars 11792 1727096182.84994: variable 'port1_profile' from source: play vars 11792 1727096182.84996: variable 'port1_profile' from source: play vars 11792 1727096182.84998: variable 'controller_profile' from source: play vars 11792 1727096182.85020: variable 'controller_profile' from source: play vars 11792 1727096182.85034: variable 'ansible_distribution' from source: facts 11792 1727096182.85037: variable '__network_rh_distros' from source: role '' defaults 11792 1727096182.85043: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.85061: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096182.85265: variable 'ansible_distribution' from source: facts 11792 1727096182.85271: variable '__network_rh_distros' from source: role '' defaults 11792 1727096182.85276: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.85290: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096182.85452: 
variable 'ansible_distribution' from source: facts 11792 1727096182.85458: variable '__network_rh_distros' from source: role '' defaults 11792 1727096182.85467: variable 'ansible_distribution_major_version' from source: facts 11792 1727096182.85508: variable 'network_provider' from source: set_fact 11792 1727096182.85524: variable 'ansible_facts' from source: unknown 11792 1727096182.86229: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11792 1727096182.86234: when evaluation is False, skipping this task 11792 1727096182.86237: _execute() done 11792 1727096182.86239: dumping result to json 11792 1727096182.86241: done dumping result, returning 11792 1727096182.86248: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-d9c7-3fc0-000000000e12] 11792 1727096182.86250: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e12 11792 1727096182.86349: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e12 11792 1727096182.86352: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11792 1727096182.86429: no more pending results, returning what we have 11792 1727096182.86432: results queue empty 11792 1727096182.86433: checking for any_errors_fatal 11792 1727096182.86438: done checking for any_errors_fatal 11792 1727096182.86439: checking for max_fail_percentage 11792 1727096182.86440: done checking for max_fail_percentage 11792 1727096182.86441: checking to see if all hosts have failed and the running result is not ok 11792 1727096182.86442: done checking to see if all hosts have failed 11792 1727096182.86442: getting the remaining hosts for this loop 11792 1727096182.86444: done getting the remaining hosts for this loop 11792 1727096182.86453: getting the next task for host managed_node2 11792 1727096182.86463: done getting next task for host managed_node2 11792 1727096182.86470: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096182.86475: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096182.86494: getting variables 11792 1727096182.86495: in VariableManager get_vars() 11792 1727096182.86537: Calling all_inventory to load vars for managed_node2 11792 1727096182.86539: Calling groups_inventory to load vars for managed_node2 11792 1727096182.86541: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096182.86550: Calling all_plugins_play to load vars for managed_node2 11792 1727096182.86553: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096182.86558: Calling groups_plugins_play to load vars for managed_node2 11792 1727096183.02036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096183.04821: done with get_vars() 11792 1727096183.04871: done getting variables 11792 1727096183.04926: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:56:23 -0400 (0:00:00.512) 0:01:05.328 ****** 11792 1727096183.04966: entering _queue_task() for managed_node2/package 11792 1727096183.05354: worker is 1 (out of 1 available) 11792 1727096183.05367: exiting _queue_task() for managed_node2/package 11792 1727096183.05582: done queuing things up, now waiting for results queue to drain 11792 1727096183.05584: waiting for pending results... 
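The "Install packages" skip recorded above hinges on the Jinja2 `subset` test: the task only runs when at least one requested package is missing from the gathered package facts. A minimal Python sketch of the same membership check, with hypothetical package names standing in for the role's real `network_packages` value (the trace only shows the expression, not its inputs):

    # Sketch of: not network_packages is subset(ansible_facts.packages.keys())
    # Package names below are hypothetical illustrations only.
    network_packages = ["NetworkManager"]
    packages_facts = {"NetworkManager": [], "openssh-server": []}  # real facts carry version details

    already_installed = set(network_packages) <= set(packages_facts.keys())  # the `subset` test
    install_needed = not already_installed
    print(install_needed)  # False here, matching the "skipping this task" record above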
11792 1727096183.05989: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11792 1727096183.05995: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e13 11792 1727096183.05998: variable 'ansible_search_path' from source: unknown 11792 1727096183.06003: variable 'ansible_search_path' from source: unknown 11792 1727096183.06005: calling self._execute() 11792 1727096183.06064: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096183.06070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096183.06081: variable 'omit' from source: magic vars 11792 1727096183.06481: variable 'ansible_distribution_major_version' from source: facts 11792 1727096183.06492: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096183.06780: variable 'network_state' from source: role '' defaults 11792 1727096183.06783: Evaluated conditional (network_state != {}): False 11792 1727096183.06785: when evaluation is False, skipping this task 11792 1727096183.06788: _execute() done 11792 1727096183.06791: dumping result to json 11792 1727096183.06794: done dumping result, returning 11792 1727096183.06801: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-000000000e13] 11792 1727096183.06803: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e13 11792 1727096183.06879: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e13 11792 1727096183.06882: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096183.06956: no more pending results, returning what we have 11792 1727096183.06960: results queue empty 11792 1727096183.06961: checking for any_errors_fatal 11792 1727096183.06971: done checking for any_errors_fatal 11792 1727096183.06971: checking for max_fail_percentage 11792 1727096183.06973: done checking for max_fail_percentage 11792 1727096183.06974: checking to see if all hosts have failed and the running result is not ok 11792 1727096183.06975: done checking to see if all hosts have failed 11792 1727096183.06975: getting the remaining hosts for this loop 11792 1727096183.06977: done getting the remaining hosts for this loop 11792 1727096183.06981: getting the next task for host managed_node2 11792 1727096183.06989: done getting next task for host managed_node2 11792 1727096183.06993: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096183.06999: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096183.07025: getting variables 11792 1727096183.07028: in VariableManager get_vars() 11792 1727096183.07220: Calling all_inventory to load vars for managed_node2 11792 1727096183.07227: Calling groups_inventory to load vars for managed_node2 11792 1727096183.07230: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096183.07239: Calling all_plugins_play to load vars for managed_node2 11792 1727096183.07242: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096183.07245: Calling groups_plugins_play to load vars for managed_node2 11792 1727096183.08593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096183.10252: done with get_vars() 11792 1727096183.10286: done getting variables 11792 1727096183.10347: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:56:23 -0400 (0:00:00.054) 0:01:05.383 ****** 11792 1727096183.10386: entering _queue_task() for managed_node2/package 11792 1727096183.10805: worker is 1 (out of 1 available) 11792 1727096183.10821: exiting _queue_task() for managed_node2/package 11792 1727096183.10837: done queuing things up, now waiting for results queue to drain 11792 1727096183.10839: waiting for pending results... 
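Both nmstate-related install tasks (main.yml:85 and main.yml:96) are gated the same way: they only run when the caller supplies a declarative network_state, which defaults to an empty dict. A small sketch of the two conditions the trace shows being evaluated; the distribution version is a hypothetical fact value:

    # Reproduces the two conditionals logged for the nmstate install tasks.
    network_state = {}                              # role default, per the log
    ansible_distribution_major_version = "40"       # hypothetical fact value
    run_task = (ansible_distribution_major_version != "6") and (network_state != {})
    print(run_task)  # False -> "skipping this task", as logged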
11792 1727096183.11524: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11792 1727096183.11810: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e14 11792 1727096183.11875: variable 'ansible_search_path' from source: unknown 11792 1727096183.11880: variable 'ansible_search_path' from source: unknown 11792 1727096183.11883: calling self._execute() 11792 1727096183.12050: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096183.12054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096183.12093: variable 'omit' from source: magic vars 11792 1727096183.12780: variable 'ansible_distribution_major_version' from source: facts 11792 1727096183.12873: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096183.13211: variable 'network_state' from source: role '' defaults 11792 1727096183.13215: Evaluated conditional (network_state != {}): False 11792 1727096183.13218: when evaluation is False, skipping this task 11792 1727096183.13221: _execute() done 11792 1727096183.13224: dumping result to json 11792 1727096183.13226: done dumping result, returning 11792 1727096183.13228: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-d9c7-3fc0-000000000e14] 11792 1727096183.13231: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e14 11792 1727096183.13390: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e14 11792 1727096183.13394: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096183.13457: no more pending results, returning what we have 11792 1727096183.13462: results queue empty 11792 1727096183.13463: checking for any_errors_fatal 11792 1727096183.13472: done checking for any_errors_fatal 11792 1727096183.13473: checking for max_fail_percentage 11792 1727096183.13475: done checking for max_fail_percentage 11792 1727096183.13476: checking to see if all hosts have failed and the running result is not ok 11792 1727096183.13477: done checking to see if all hosts have failed 11792 1727096183.13478: getting the remaining hosts for this loop 11792 1727096183.13479: done getting the remaining hosts for this loop 11792 1727096183.13483: getting the next task for host managed_node2 11792 1727096183.13492: done getting next task for host managed_node2 11792 1727096183.13498: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096183.13505: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096183.13527: getting variables 11792 1727096183.13528: in VariableManager get_vars() 11792 1727096183.13579: Calling all_inventory to load vars for managed_node2 11792 1727096183.13582: Calling groups_inventory to load vars for managed_node2 11792 1727096183.13584: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096183.13597: Calling all_plugins_play to load vars for managed_node2 11792 1727096183.13599: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096183.13601: Calling groups_plugins_play to load vars for managed_node2 11792 1727096183.16916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096183.20372: done with get_vars() 11792 1727096183.20408: done getting variables 11792 1727096183.20462: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:56:23 -0400 (0:00:00.101) 0:01:05.484 ****** 11792 1727096183.20503: entering _queue_task() for managed_node2/service 11792 1727096183.21276: worker is 1 (out of 1 available) 11792 1727096183.21291: exiting _queue_task() for managed_node2/service 11792 1727096183.21305: done queuing things up, now waiting for results queue to drain 11792 1727096183.21307: waiting for pending results... 
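The restart task queued above is evaluated in the records that follow; it fires only when one of the requested connections is a wireless or team interface. A hedged sketch of that kind of check, using made-up bond profiles in place of the controller_profile/port1_profile/port2_profile play vars, whose contents are not shown in this part of the trace:

    # Hypothetical connection profiles for illustration only.
    network_connections = [
        {"name": "bond0", "type": "bond"},
        {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
        {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},
    ]
    wireless_or_team = any(c.get("type") in ("wireless", "team") for c in network_connections)
    print(wireless_or_team)  # False -> NetworkManager is not restarted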
11792 1727096183.22361: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11792 1727096183.22492: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e15 11792 1727096183.22504: variable 'ansible_search_path' from source: unknown 11792 1727096183.22573: variable 'ansible_search_path' from source: unknown 11792 1727096183.22578: calling self._execute() 11792 1727096183.22654: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096183.22669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096183.22684: variable 'omit' from source: magic vars 11792 1727096183.23072: variable 'ansible_distribution_major_version' from source: facts 11792 1727096183.23089: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096183.23217: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096183.23776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096183.27942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096183.28035: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096183.28219: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096183.28258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096183.28380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096183.28462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.28547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.28657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.28709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096183.28753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.28897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.28924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.29073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11792 1727096183.29117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096183.29137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.29252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.29288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.29317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.29589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096183.29592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.30074: variable 'network_connections' from source: task vars 11792 1727096183.30077: variable 'port2_profile' from source: play vars 11792 1727096183.30214: variable 'port2_profile' from source: play vars 11792 1727096183.30288: variable 'port1_profile' from source: play vars 11792 1727096183.30363: variable 'port1_profile' from source: play vars 11792 1727096183.30485: variable 'controller_profile' from source: play vars 11792 1727096183.30557: variable 'controller_profile' from source: play vars 11792 1727096183.30708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096183.31165: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096183.31325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096183.31366: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096183.31409: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096183.31526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096183.31600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096183.31719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.31723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096183.31825: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096183.32359: variable 'network_connections' from source: task vars 11792 1727096183.32575: variable 'port2_profile' from source: play vars 11792 1727096183.32578: variable 'port2_profile' from source: play vars 11792 1727096183.32580: variable 'port1_profile' from source: play vars 11792 1727096183.32630: variable 'port1_profile' from source: play vars 11792 1727096183.32645: variable 'controller_profile' from source: play vars 11792 1727096183.32709: variable 'controller_profile' from source: play vars 11792 1727096183.32973: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11792 1727096183.32985: when evaluation is False, skipping this task 11792 1727096183.32987: _execute() done 11792 1727096183.32990: dumping result to json 11792 1727096183.32993: done dumping result, returning 11792 1727096183.32995: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-d9c7-3fc0-000000000e15] 11792 1727096183.32997: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e15 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11792 1727096183.33159: no more pending results, returning what we have 11792 1727096183.33163: results queue empty 11792 1727096183.33164: checking for any_errors_fatal 11792 1727096183.33172: done checking for any_errors_fatal 11792 1727096183.33173: checking for max_fail_percentage 11792 1727096183.33176: done checking for max_fail_percentage 11792 1727096183.33177: checking to see if all hosts have failed and the running result is not ok 11792 1727096183.33177: done checking to see if all hosts have failed 11792 1727096183.33178: getting the remaining hosts for this loop 11792 1727096183.33179: done getting the remaining hosts for this loop 11792 1727096183.33183: getting the next task for host managed_node2 11792 1727096183.33193: done getting next task for host managed_node2 11792 1727096183.33198: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096183.33204: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11792 1727096183.33227: getting variables 11792 1727096183.33229: in VariableManager get_vars() 11792 1727096183.33283: Calling all_inventory to load vars for managed_node2 11792 1727096183.33285: Calling groups_inventory to load vars for managed_node2 11792 1727096183.33288: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096183.33298: Calling all_plugins_play to load vars for managed_node2 11792 1727096183.33302: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096183.33305: Calling groups_plugins_play to load vars for managed_node2 11792 1727096183.34184: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e15 11792 1727096183.34188: WORKER PROCESS EXITING 11792 1727096183.34960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096183.36682: done with get_vars() 11792 1727096183.36705: done getting variables 11792 1727096183.36770: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:23 -0400 (0:00:00.163) 0:01:05.647 ****** 11792 1727096183.36807: entering _queue_task() for managed_node2/service 11792 1727096183.37169: worker is 1 (out of 1 available) 11792 1727096183.37181: exiting _queue_task() for managed_node2/service 11792 1727096183.37195: done queuing things up, now waiting for results queue to drain 11792 1727096183.37197: waiting for pending results... 
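Unlike the three skips above, the "Enable and start NetworkManager" task passes its gate: the records below show network_provider == "nm" or network_state != {} evaluating True, after which the controller creates a temporary directory on the target, transfers the AnsiballZ-wrapped systemd module over SFTP, makes it executable, and runs it with the remote Python interpreter. A sketch of just the gate, using the values visible in the trace:

    # Values taken from the surrounding records (network_provider comes from set_fact).
    network_provider = "nm"
    network_state = {}
    enable_and_start = (network_provider == "nm") or (network_state != {})
    print(enable_and_start)  # True -> the service task proceeds to the remote host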
11792 1727096183.37496: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11792 1727096183.37669: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e16 11792 1727096183.37689: variable 'ansible_search_path' from source: unknown 11792 1727096183.37696: variable 'ansible_search_path' from source: unknown 11792 1727096183.37738: calling self._execute() 11792 1727096183.37837: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096183.37849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096183.37863: variable 'omit' from source: magic vars 11792 1727096183.38234: variable 'ansible_distribution_major_version' from source: facts 11792 1727096183.38250: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096183.38417: variable 'network_provider' from source: set_fact 11792 1727096183.38426: variable 'network_state' from source: role '' defaults 11792 1727096183.38441: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11792 1727096183.38452: variable 'omit' from source: magic vars 11792 1727096183.38531: variable 'omit' from source: magic vars 11792 1727096183.38565: variable 'network_service_name' from source: role '' defaults 11792 1727096183.38640: variable 'network_service_name' from source: role '' defaults 11792 1727096183.38750: variable '__network_provider_setup' from source: role '' defaults 11792 1727096183.38773: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096183.38916: variable '__network_service_name_default_nm' from source: role '' defaults 11792 1727096183.38919: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096183.38921: variable '__network_packages_default_nm' from source: role '' defaults 11792 1727096183.39150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096183.41336: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096183.41422: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096183.41462: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096183.41506: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096183.41541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096183.41619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.41974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.41978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.41981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11792 1727096183.42082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.42086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.42088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.42097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.42140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096183.42161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.42703: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11792 1727096183.42945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.43083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.43112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.43155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096183.43182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.43366: variable 'ansible_python' from source: facts 11792 1727096183.43599: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11792 1727096183.43602: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096183.43752: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096183.44054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.44089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.44173: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.44217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096183.44471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.44475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096183.44486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096183.44683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096183.44686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096183.44689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096183.45028: variable 'network_connections' from source: task vars 11792 1727096183.45043: variable 'port2_profile' from source: play vars 11792 1727096183.45185: variable 'port2_profile' from source: play vars 11792 1727096183.45204: variable 'port1_profile' from source: play vars 11792 1727096183.45530: variable 'port1_profile' from source: play vars 11792 1727096183.45639: variable 'controller_profile' from source: play vars 11792 1727096183.45642: variable 'controller_profile' from source: play vars 11792 1727096183.45736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096183.46009: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096183.46063: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096183.46117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096183.46162: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096183.46237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096183.46274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096183.46315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11792 1727096183.46352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096183.46417: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096183.46723: variable 'network_connections' from source: task vars 11792 1727096183.46761: variable 'port2_profile' from source: play vars 11792 1727096183.46860: variable 'port2_profile' from source: play vars 11792 1727096183.46900: variable 'port1_profile' from source: play vars 11792 1727096183.46979: variable 'port1_profile' from source: play vars 11792 1727096183.46997: variable 'controller_profile' from source: play vars 11792 1727096183.47112: variable 'controller_profile' from source: play vars 11792 1727096183.47151: variable '__network_packages_default_wireless' from source: role '' defaults 11792 1727096183.47247: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096183.47553: variable 'network_connections' from source: task vars 11792 1727096183.47565: variable 'port2_profile' from source: play vars 11792 1727096183.47644: variable 'port2_profile' from source: play vars 11792 1727096183.47658: variable 'port1_profile' from source: play vars 11792 1727096183.47737: variable 'port1_profile' from source: play vars 11792 1727096183.47750: variable 'controller_profile' from source: play vars 11792 1727096183.47824: variable 'controller_profile' from source: play vars 11792 1727096183.47855: variable '__network_packages_default_team' from source: role '' defaults 11792 1727096183.47938: variable '__network_team_connections_defined' from source: role '' defaults 11792 1727096183.48246: variable 'network_connections' from source: task vars 11792 1727096183.48261: variable 'port2_profile' from source: play vars 11792 1727096183.48336: variable 'port2_profile' from source: play vars 11792 1727096183.48349: variable 'port1_profile' from source: play vars 11792 1727096183.48424: variable 'port1_profile' from source: play vars 11792 1727096183.48437: variable 'controller_profile' from source: play vars 11792 1727096183.48512: variable 'controller_profile' from source: play vars 11792 1727096183.48571: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096183.48636: variable '__network_service_name_default_initscripts' from source: role '' defaults 11792 1727096183.48674: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096183.48827: variable '__network_packages_default_initscripts' from source: role '' defaults 11792 1727096183.49246: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11792 1727096183.49730: variable 'network_connections' from source: task vars 11792 1727096183.49740: variable 'port2_profile' from source: play vars 11792 1727096183.49805: variable 'port2_profile' from source: play vars 11792 1727096183.49818: variable 'port1_profile' from source: play vars 11792 1727096183.49880: variable 'port1_profile' from source: play vars 11792 1727096183.49896: variable 'controller_profile' from source: play vars 11792 1727096183.49957: variable 'controller_profile' from source: play vars 11792 1727096183.49975: variable 'ansible_distribution' from source: facts 11792 1727096183.49984: variable '__network_rh_distros' from source: role '' defaults 11792 1727096183.50000: variable 
'ansible_distribution_major_version' from source: facts 11792 1727096183.50020: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11792 1727096183.50198: variable 'ansible_distribution' from source: facts 11792 1727096183.50208: variable '__network_rh_distros' from source: role '' defaults 11792 1727096183.50223: variable 'ansible_distribution_major_version' from source: facts 11792 1727096183.50267: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11792 1727096183.50440: variable 'ansible_distribution' from source: facts 11792 1727096183.50449: variable '__network_rh_distros' from source: role '' defaults 11792 1727096183.50458: variable 'ansible_distribution_major_version' from source: facts 11792 1727096183.50524: variable 'network_provider' from source: set_fact 11792 1727096183.50556: variable 'omit' from source: magic vars 11792 1727096183.50591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096183.50645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096183.50774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096183.50777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096183.50779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096183.50781: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096183.50784: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096183.50785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096183.51013: Set connection var ansible_timeout to 10 11792 1727096183.51016: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096183.51018: Set connection var ansible_shell_executable to /bin/sh 11792 1727096183.51020: Set connection var ansible_pipelining to False 11792 1727096183.51022: Set connection var ansible_shell_type to sh 11792 1727096183.51024: Set connection var ansible_connection to ssh 11792 1727096183.51026: variable 'ansible_shell_executable' from source: unknown 11792 1727096183.51028: variable 'ansible_connection' from source: unknown 11792 1727096183.51029: variable 'ansible_module_compression' from source: unknown 11792 1727096183.51032: variable 'ansible_shell_type' from source: unknown 11792 1727096183.51034: variable 'ansible_shell_executable' from source: unknown 11792 1727096183.51035: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096183.51037: variable 'ansible_pipelining' from source: unknown 11792 1727096183.51039: variable 'ansible_timeout' from source: unknown 11792 1727096183.51040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096183.51148: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096183.51165: variable 'omit' from source: magic vars 11792 1727096183.51178: starting attempt loop 11792 1727096183.51230: running the 
handler 11792 1727096183.51285: variable 'ansible_facts' from source: unknown 11792 1727096183.52112: _low_level_execute_command(): starting 11792 1727096183.52116: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096183.52620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096183.52625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096183.52629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.52674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096183.52689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096183.52735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096183.54472: stdout chunk (state=3): >>>/root <<< 11792 1727096183.54596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096183.54624: stderr chunk (state=3): >>><<< 11792 1727096183.54627: stdout chunk (state=3): >>><<< 11792 1727096183.54680: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096183.54689: _low_level_execute_command(): starting 11792 1727096183.54694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501 `" && echo 
ansible-tmp-1727096183.5465345-14770-115128617034501="` echo /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501 `" ) && sleep 0' 11792 1727096183.55164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096183.55170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.55173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096183.55176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.55230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096183.55235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096183.55237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096183.55275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096183.57386: stdout chunk (state=3): >>>ansible-tmp-1727096183.5465345-14770-115128617034501=/root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501 <<< 11792 1727096183.57510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096183.57514: stdout chunk (state=3): >>><<< 11792 1727096183.57517: stderr chunk (state=3): >>><<< 11792 1727096183.57676: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096183.5465345-14770-115128617034501=/root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096183.57684: variable 'ansible_module_compression' from 
source: unknown 11792 1727096183.57687: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11792 1727096183.57718: variable 'ansible_facts' from source: unknown 11792 1727096183.57971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/AnsiballZ_systemd.py 11792 1727096183.58150: Sending initial data 11792 1727096183.58170: Sent initial data (156 bytes) 11792 1727096183.58623: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096183.58628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096183.58659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.58663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096183.58665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096183.58670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.58720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096183.58723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096183.58726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096183.58786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096183.60516: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096183.60557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096183.60619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/AnsiballZ_systemd.py" <<< 11792 1727096183.60623: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmph2naw2k2 /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/AnsiballZ_systemd.py <<< 11792 1727096183.60651: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmph2naw2k2" to remote "/root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/AnsiballZ_systemd.py" <<< 11792 1727096183.62015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096183.62072: stderr chunk (state=3): >>><<< 11792 1727096183.62183: stdout chunk (state=3): >>><<< 11792 1727096183.62187: done transferring module to remote 11792 1727096183.62189: _low_level_execute_command(): starting 11792 1727096183.62191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/ /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/AnsiballZ_systemd.py && sleep 0' 11792 1727096183.62727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096183.62731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096183.62734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.62779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096183.62798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096183.62827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096183.64763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096183.64769: stdout chunk (state=3): >>><<< 11792 1727096183.64772: stderr chunk (state=3): >>><<< 11792 1727096183.64975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096183.64979: _low_level_execute_command(): starting 11792 1727096183.64982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/AnsiballZ_systemd.py && sleep 0' 11792 1727096183.65411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.65420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096183.65451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.65492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096183.65504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096183.65550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096183.96662: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", 
"NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4579328", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318243328", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "887573000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 11792 1727096183.96682: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": 
"65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": 
"multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "syste<<< 11792 1727096183.96691: stdout chunk (state=3): >>>md-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11792 1727096183.98740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096183.98771: stderr chunk (state=3): >>><<< 11792 1727096183.98775: stdout chunk (state=3): >>><<< 11792 1727096183.98793: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4579328", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318243328", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "887573000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096183.98918: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096183.98933: _low_level_execute_command(): starting 11792 1727096183.98938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096183.5465345-14770-115128617034501/ > /dev/null 2>&1 && sleep 0' 11792 1727096183.99403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096183.99407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096183.99409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096183.99411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096183.99413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096183.99478: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096183.99480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096183.99482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096183.99513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096184.01575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096184.01579: stdout chunk (state=3): >>><<< 11792 1727096184.01581: stderr chunk (state=3): >>><<< 11792 1727096184.01583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096184.01586: handler run complete 11792 1727096184.01588: attempt loop complete, returning result 11792 1727096184.01589: _execute() done 11792 1727096184.01591: dumping result to json 11792 1727096184.01593: done dumping result, returning 11792 1727096184.01600: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-d9c7-3fc0-000000000e16] 11792 1727096184.01603: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e16 11792 1727096184.01899: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e16 11792 1727096184.01903: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096184.01960: no more pending results, returning what we have 11792 1727096184.01964: results queue empty 11792 1727096184.01965: checking for any_errors_fatal 11792 1727096184.01975: done checking for any_errors_fatal 11792 1727096184.01975: checking for max_fail_percentage 11792 1727096184.01977: done checking for max_fail_percentage 11792 1727096184.01978: checking to see if all hosts have failed and the running result is not ok 11792 1727096184.01978: done checking to see if all hosts have failed 11792 1727096184.01979: getting the remaining hosts for this loop 11792 1727096184.01981: done getting the remaining hosts for this loop 11792 1727096184.01984: getting the next task for host managed_node2 11792 1727096184.01991: done getting next task for host managed_node2 11792 1727096184.01995: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 
1727096184.02000: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096184.02012: getting variables 11792 1727096184.02013: in VariableManager get_vars() 11792 1727096184.02052: Calling all_inventory to load vars for managed_node2 11792 1727096184.02055: Calling groups_inventory to load vars for managed_node2 11792 1727096184.02056: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096184.02066: Calling all_plugins_play to load vars for managed_node2 11792 1727096184.02176: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096184.02181: Calling groups_plugins_play to load vars for managed_node2 11792 1727096184.03488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096184.05186: done with get_vars() 11792 1727096184.05219: done getting variables 11792 1727096184.05297: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:24 -0400 (0:00:00.685) 0:01:06.332 ****** 11792 1727096184.05343: entering _queue_task() for managed_node2/service 11792 1727096184.05751: worker is 1 (out of 1 available) 11792 1727096184.05765: exiting _queue_task() for managed_node2/service 11792 1727096184.05981: done queuing things up, now waiting for results queue to drain 11792 1727096184.05983: waiting for pending results... 
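The module arguments recorded in the invocation above (ansible.legacy.systemd called with name=NetworkManager, state=started, enabled=true, scope=system, and no_log enabled) correspond to a task of roughly the following shape. This is a sketch reconstructed only from the arguments visible in the log, not the actual task in roles/network/tasks/main.yml, which dispatches through the generic 'service' action plugin shown being loaded above.

- name: Enable and start NetworkManager
  ansible.builtin.systemd:          # the role goes via the 'service' action plugin; systemd is what actually ran here
    name: NetworkManager
    state: started
    enabled: true
    scope: system
  no_log: true                      # why the task result above is reported as "censored"

Because no_log is set, the per-task result is reduced to the "output has been hidden" placeholder even though the raw module JSON remains visible in this -vvv transcript.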
11792 1727096184.06296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11792 1727096184.06302: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e17 11792 1727096184.06313: variable 'ansible_search_path' from source: unknown 11792 1727096184.06321: variable 'ansible_search_path' from source: unknown 11792 1727096184.06366: calling self._execute() 11792 1727096184.06477: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096184.06491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096184.06513: variable 'omit' from source: magic vars 11792 1727096184.06892: variable 'ansible_distribution_major_version' from source: facts 11792 1727096184.06908: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096184.07030: variable 'network_provider' from source: set_fact 11792 1727096184.07044: Evaluated conditional (network_provider == "nm"): True 11792 1727096184.07132: variable '__network_wpa_supplicant_required' from source: role '' defaults 11792 1727096184.07260: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11792 1727096184.07413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096184.10331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096184.10409: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096184.10509: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096184.10513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096184.10523: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096184.10608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096184.10647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096184.10726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096184.10730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096184.10745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096184.10799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096184.10827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11792 1727096184.10864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096184.10910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096184.10928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096184.11055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096184.11062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096184.11064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096184.11080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096184.11098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096184.11248: variable 'network_connections' from source: task vars 11792 1727096184.11275: variable 'port2_profile' from source: play vars 11792 1727096184.11346: variable 'port2_profile' from source: play vars 11792 1727096184.11370: variable 'port1_profile' from source: play vars 11792 1727096184.11439: variable 'port1_profile' from source: play vars 11792 1727096184.11453: variable 'controller_profile' from source: play vars 11792 1727096184.11522: variable 'controller_profile' from source: play vars 11792 1727096184.11624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11792 1727096184.11807: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11792 1727096184.11973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11792 1727096184.11976: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11792 1727096184.11978: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11792 1727096184.11981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11792 1727096184.11987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11792 1727096184.12016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096184.12045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11792 1727096184.12107: variable '__network_wireless_connections_defined' from source: role '' defaults 11792 1727096184.12360: variable 'network_connections' from source: task vars 11792 1727096184.12373: variable 'port2_profile' from source: play vars 11792 1727096184.12439: variable 'port2_profile' from source: play vars 11792 1727096184.12451: variable 'port1_profile' from source: play vars 11792 1727096184.12516: variable 'port1_profile' from source: play vars 11792 1727096184.12541: variable 'controller_profile' from source: play vars 11792 1727096184.12606: variable 'controller_profile' from source: play vars 11792 1727096184.12644: Evaluated conditional (__network_wpa_supplicant_required): False 11792 1727096184.12652: when evaluation is False, skipping this task 11792 1727096184.12662: _execute() done 11792 1727096184.12671: dumping result to json 11792 1727096184.12750: done dumping result, returning 11792 1727096184.12753: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-d9c7-3fc0-000000000e17] 11792 1727096184.12758: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e17 11792 1727096184.12829: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e17 11792 1727096184.12832: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11792 1727096184.12908: no more pending results, returning what we have 11792 1727096184.12912: results queue empty 11792 1727096184.12913: checking for any_errors_fatal 11792 1727096184.12937: done checking for any_errors_fatal 11792 1727096184.12938: checking for max_fail_percentage 11792 1727096184.12940: done checking for max_fail_percentage 11792 1727096184.12940: checking to see if all hosts have failed and the running result is not ok 11792 1727096184.12941: done checking to see if all hosts have failed 11792 1727096184.12942: getting the remaining hosts for this loop 11792 1727096184.12944: done getting the remaining hosts for this loop 11792 1727096184.12948: getting the next task for host managed_node2 11792 1727096184.12961: done getting next task for host managed_node2 11792 1727096184.12965: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096184.12973: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096184.12995: getting variables 11792 1727096184.12997: in VariableManager get_vars() 11792 1727096184.13047: Calling all_inventory to load vars for managed_node2 11792 1727096184.13050: Calling groups_inventory to load vars for managed_node2 11792 1727096184.13053: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096184.13066: Calling all_plugins_play to load vars for managed_node2 11792 1727096184.13277: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096184.13281: Calling groups_plugins_play to load vars for managed_node2 11792 1727096184.15117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096184.16720: done with get_vars() 11792 1727096184.16756: done getting variables 11792 1727096184.16818: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:24 -0400 (0:00:00.115) 0:01:06.447 ****** 11792 1727096184.16859: entering _queue_task() for managed_node2/service 11792 1727096184.17243: worker is 1 (out of 1 available) 11792 1727096184.17256: exiting _queue_task() for managed_node2/service 11792 1727096184.17272: done queuing things up, now waiting for results queue to drain 11792 1727096184.17274: waiting for pending results... 
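The wpa_supplicant task above is skipped because __network_wpa_supplicant_required evaluates to False, even though the two earlier conditionals (ansible_distribution_major_version != '6' and network_provider == "nm") both passed. The gating pattern the log reflects looks roughly like the sketch below; the when conditions are taken from the evaluated conditionals in the log, while the module body is illustrative only.

- name: Enable and start wpa_supplicant
  ansible.builtin.service:          # illustrative body; the role loads the 'service' action plugin for this task
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool   # False here: no wireless/802.1x connections defined

When a when clause fails, the skipped result records the failing expression as false_condition, which is exactly what the "skipping: [managed_node2]" block above shows.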
11792 1727096184.17787: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11792 1727096184.17795: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e18 11792 1727096184.17798: variable 'ansible_search_path' from source: unknown 11792 1727096184.17802: variable 'ansible_search_path' from source: unknown 11792 1727096184.17805: calling self._execute() 11792 1727096184.17905: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096184.17922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096184.17936: variable 'omit' from source: magic vars 11792 1727096184.18342: variable 'ansible_distribution_major_version' from source: facts 11792 1727096184.18365: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096184.18496: variable 'network_provider' from source: set_fact 11792 1727096184.18508: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096184.18516: when evaluation is False, skipping this task 11792 1727096184.18524: _execute() done 11792 1727096184.18531: dumping result to json 11792 1727096184.18539: done dumping result, returning 11792 1727096184.18552: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-d9c7-3fc0-000000000e18] 11792 1727096184.18560: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e18 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11792 1727096184.18834: no more pending results, returning what we have 11792 1727096184.18839: results queue empty 11792 1727096184.18840: checking for any_errors_fatal 11792 1727096184.18852: done checking for any_errors_fatal 11792 1727096184.18853: checking for max_fail_percentage 11792 1727096184.18854: done checking for max_fail_percentage 11792 1727096184.18856: checking to see if all hosts have failed and the running result is not ok 11792 1727096184.18856: done checking to see if all hosts have failed 11792 1727096184.18857: getting the remaining hosts for this loop 11792 1727096184.18859: done getting the remaining hosts for this loop 11792 1727096184.18863: getting the next task for host managed_node2 11792 1727096184.18880: done getting next task for host managed_node2 11792 1727096184.18886: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096184.18893: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096184.19001: getting variables 11792 1727096184.19003: in VariableManager get_vars() 11792 1727096184.19065: Calling all_inventory to load vars for managed_node2 11792 1727096184.19172: Calling groups_inventory to load vars for managed_node2 11792 1727096184.19176: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096184.19188: Calling all_plugins_play to load vars for managed_node2 11792 1727096184.19191: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096184.19194: Calling groups_plugins_play to load vars for managed_node2 11792 1727096184.19211: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e18 11792 1727096184.19214: WORKER PROCESS EXITING 11792 1727096184.20647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096184.22348: done with get_vars() 11792 1727096184.22383: done getting variables 11792 1727096184.22443: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:56:24 -0400 (0:00:00.056) 0:01:06.504 ****** 11792 1727096184.22489: entering _queue_task() for managed_node2/copy 11792 1727096184.22916: worker is 1 (out of 1 available) 11792 1727096184.22929: exiting _queue_task() for managed_node2/copy 11792 1727096184.22941: done queuing things up, now waiting for results queue to drain 11792 1727096184.22943: waiting for pending results... 
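Both this "Enable network service" task and the following "Ensure initscripts network file dependency is present" task are gated on network_provider == "initscripts", which is False because the role earlier set network_provider to "nm" via set_fact, so both are skipped without touching the host. A minimal sketch of that branch, assuming the legacy "network" initscripts service as the target (the service name is not shown in the log):

- name: Enable network service
  ansible.builtin.service:
    name: network                   # assumed: the classic initscripts network service
    state: started
    enabled: true
  when: network_provider == "initscripts"
  no_log: true                      # matches the censored skip result logged above

Note that no_log applies even to skipped results, which is why this skip is reported as "censored" while the wpa_supplicant skip above shows its false_condition in the clear.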
11792 1727096184.23204: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11792 1727096184.23389: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e19 11792 1727096184.23410: variable 'ansible_search_path' from source: unknown 11792 1727096184.23417: variable 'ansible_search_path' from source: unknown 11792 1727096184.23466: calling self._execute() 11792 1727096184.23578: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096184.23651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096184.23655: variable 'omit' from source: magic vars 11792 1727096184.24009: variable 'ansible_distribution_major_version' from source: facts 11792 1727096184.24025: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096184.24135: variable 'network_provider' from source: set_fact 11792 1727096184.24146: Evaluated conditional (network_provider == "initscripts"): False 11792 1727096184.24152: when evaluation is False, skipping this task 11792 1727096184.24158: _execute() done 11792 1727096184.24165: dumping result to json 11792 1727096184.24174: done dumping result, returning 11792 1727096184.24196: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-d9c7-3fc0-000000000e19] 11792 1727096184.24208: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e19 11792 1727096184.24527: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e19 11792 1727096184.24531: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11792 1727096184.24588: no more pending results, returning what we have 11792 1727096184.24592: results queue empty 11792 1727096184.24593: checking for any_errors_fatal 11792 1727096184.24601: done checking for any_errors_fatal 11792 1727096184.24602: checking for max_fail_percentage 11792 1727096184.24604: done checking for max_fail_percentage 11792 1727096184.24605: checking to see if all hosts have failed and the running result is not ok 11792 1727096184.24606: done checking to see if all hosts have failed 11792 1727096184.24606: getting the remaining hosts for this loop 11792 1727096184.24608: done getting the remaining hosts for this loop 11792 1727096184.24611: getting the next task for host managed_node2 11792 1727096184.24622: done getting next task for host managed_node2 11792 1727096184.24626: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096184.24633: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096184.24661: getting variables 11792 1727096184.24663: in VariableManager get_vars() 11792 1727096184.24721: Calling all_inventory to load vars for managed_node2 11792 1727096184.24724: Calling groups_inventory to load vars for managed_node2 11792 1727096184.24727: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096184.24740: Calling all_plugins_play to load vars for managed_node2 11792 1727096184.24744: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096184.24747: Calling groups_plugins_play to load vars for managed_node2 11792 1727096184.26370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096184.27928: done with get_vars() 11792 1727096184.27961: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:56:24 -0400 (0:00:00.055) 0:01:06.559 ****** 11792 1727096184.28058: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096184.28427: worker is 1 (out of 1 available) 11792 1727096184.28439: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11792 1727096184.28453: done queuing things up, now waiting for results queue to drain 11792 1727096184.28455: waiting for pending results... 
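While templating the network_connections list for these tasks, the engine repeatedly resolves controller_profile, port1_profile and port2_profile from play vars, which suggests a bonded topology with one controller profile and two port profiles. The actual values never appear in this log; the sketch below is purely hypothetical and only illustrates the approximate shape such play vars could take.

# Hypothetical play vars; names, types and keys are NOT taken from the log.
controller_profile: bond0
port1_profile: bond0.0
port2_profile: bond0.1
network_connections:
  - name: "{{ controller_profile }}"
    type: bond
    state: up
  - name: "{{ port1_profile }}"
    type: ethernet
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    type: ethernet
    controller: "{{ controller_profile }}"

The upcoming "Configure networking connection profiles" task also resolves __lsr_ansible_managed through the get_ansible_managed.j2 template lookup shown a few lines further down.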
11792 1727096184.28758: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11792 1727096184.28925: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e1a 11792 1727096184.29073: variable 'ansible_search_path' from source: unknown 11792 1727096184.29077: variable 'ansible_search_path' from source: unknown 11792 1727096184.29081: calling self._execute() 11792 1727096184.29104: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096184.29116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096184.29131: variable 'omit' from source: magic vars 11792 1727096184.29519: variable 'ansible_distribution_major_version' from source: facts 11792 1727096184.29541: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096184.29553: variable 'omit' from source: magic vars 11792 1727096184.29642: variable 'omit' from source: magic vars 11792 1727096184.29814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11792 1727096184.32042: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11792 1727096184.32120: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11792 1727096184.32170: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11792 1727096184.32240: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11792 1727096184.32243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11792 1727096184.32327: variable 'network_provider' from source: set_fact 11792 1727096184.32475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11792 1727096184.32508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11792 1727096184.32538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11792 1727096184.32673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11792 1727096184.32677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11792 1727096184.32689: variable 'omit' from source: magic vars 11792 1727096184.32804: variable 'omit' from source: magic vars 11792 1727096184.32916: variable 'network_connections' from source: task vars 11792 1727096184.32933: variable 'port2_profile' from source: play vars 11792 1727096184.32997: variable 'port2_profile' from source: play vars 11792 1727096184.33015: variable 'port1_profile' from source: play vars 11792 1727096184.33079: variable 'port1_profile' from source: play vars 11792 1727096184.33092: variable 'controller_profile' from source: 
play vars 11792 1727096184.33153: variable 'controller_profile' from source: play vars 11792 1727096184.33683: variable 'omit' from source: magic vars 11792 1727096184.33687: variable '__lsr_ansible_managed' from source: task vars 11792 1727096184.33689: variable '__lsr_ansible_managed' from source: task vars 11792 1727096184.34030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11792 1727096184.34491: Loaded config def from plugin (lookup/template) 11792 1727096184.34561: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11792 1727096184.34596: File lookup term: get_ansible_managed.j2 11792 1727096184.34604: variable 'ansible_search_path' from source: unknown 11792 1727096184.34874: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11792 1727096184.34880: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11792 1727096184.34883: variable 'ansible_search_path' from source: unknown 11792 1727096184.42192: variable 'ansible_managed' from source: unknown 11792 1727096184.42301: variable 'omit' from source: magic vars 11792 1727096184.42360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096184.42398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096184.42447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096184.42681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096184.42685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096184.42687: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096184.42689: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096184.42692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096184.42693: Set connection var ansible_timeout to 10 11792 1727096184.42695: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096184.42697: Set connection var ansible_shell_executable to /bin/sh 11792 1727096184.42698: Set connection var ansible_pipelining to False 11792 1727096184.42700: Set connection var 
ansible_shell_type to sh 11792 1727096184.42701: Set connection var ansible_connection to ssh 11792 1727096184.42720: variable 'ansible_shell_executable' from source: unknown 11792 1727096184.42728: variable 'ansible_connection' from source: unknown 11792 1727096184.42734: variable 'ansible_module_compression' from source: unknown 11792 1727096184.42740: variable 'ansible_shell_type' from source: unknown 11792 1727096184.42745: variable 'ansible_shell_executable' from source: unknown 11792 1727096184.42750: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096184.42760: variable 'ansible_pipelining' from source: unknown 11792 1727096184.42766: variable 'ansible_timeout' from source: unknown 11792 1727096184.42785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096184.42936: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096184.42951: variable 'omit' from source: magic vars 11792 1727096184.42966: starting attempt loop 11792 1727096184.43041: running the handler 11792 1727096184.43044: _low_level_execute_command(): starting 11792 1727096184.43047: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096184.43668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096184.43693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096184.43732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096184.43744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096184.43793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096184.45493: stdout chunk (state=3): >>>/root <<< 11792 1727096184.45773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096184.45777: stdout chunk (state=3): >>><<< 11792 1727096184.45780: stderr chunk (state=3): >>><<< 11792 1727096184.45782: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096184.45785: _low_level_execute_command(): starting 11792 1727096184.45790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737 `" && echo ansible-tmp-1727096184.4568796-14799-119891755639737="` echo /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737 `" ) && sleep 0' 11792 1727096184.46376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096184.46380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096184.46395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096184.46415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096184.46427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096184.46433: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096184.46443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096184.46460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096184.46463: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096184.46578: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096184.46581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096184.46584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096184.46591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096184.46662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096184.48920: stdout chunk (state=3): >>>ansible-tmp-1727096184.4568796-14799-119891755639737=/root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737 <<< 11792 1727096184.49177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096184.49181: stdout chunk (state=3): >>><<< 11792 1727096184.49183: stderr chunk (state=3): >>><<< 11792 
1727096184.49186: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096184.4568796-14799-119891755639737=/root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096184.49189: variable 'ansible_module_compression' from source: unknown 11792 1727096184.49212: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11792 1727096184.49262: variable 'ansible_facts' from source: unknown 11792 1727096184.49413: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/AnsiballZ_network_connections.py 11792 1727096184.49573: Sending initial data 11792 1727096184.49582: Sent initial data (168 bytes) 11792 1727096184.50203: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096184.50222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096184.50265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096184.50349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096184.51996: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096184.52023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096184.52066: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp_0ag3flj /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/AnsiballZ_network_connections.py <<< 11792 1727096184.52073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/AnsiballZ_network_connections.py" <<< 11792 1727096184.52095: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp_0ag3flj" to remote "/root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/AnsiballZ_network_connections.py" <<< 11792 1727096184.52097: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/AnsiballZ_network_connections.py" <<< 11792 1727096184.52981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096184.52985: stderr chunk (state=3): >>><<< 11792 1727096184.52988: stdout chunk (state=3): >>><<< 11792 1727096184.52990: done transferring module to remote 11792 1727096184.52992: _low_level_execute_command(): starting 11792 1727096184.52994: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/ /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/AnsiballZ_network_connections.py && sleep 0' 11792 1727096184.53540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096184.53582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096184.53585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096184.53587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096184.53589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096184.53660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' 
debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096184.53663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096184.53695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096184.55601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096184.55626: stderr chunk (state=3): >>><<< 11792 1727096184.55629: stdout chunk (state=3): >>><<< 11792 1727096184.55643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096184.55646: _low_level_execute_command(): starting 11792 1727096184.55651: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/AnsiballZ_network_connections.py && sleep 0' 11792 1727096184.56123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096184.56126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096184.56129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096184.56136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096184.56139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096184.56182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096184.56185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096184.56192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096184.56238: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.13177: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11792 1727096185.13183: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/60ed1929-4403-4ab6-a959-1c7d3a8cd186: error=unknown <<< 11792 1727096185.14985: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11792 1727096185.14989: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/48cd831f-8976-4885-b9b3-6c2ccae54189: error=unknown <<< 11792 1727096185.16856: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11792 1727096185.16883: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/904f86c6-f9de-41d4-8310-1428b04dc202: error=unknown <<< 11792 1727096185.17105: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11792 1727096185.19112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096185.19138: stderr chunk (state=3): >>><<< 11792 1727096185.19141: stdout chunk (state=3): >>><<< 11792 1727096185.19165: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/60ed1929-4403-4ab6-a959-1c7d3a8cd186: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/48cd831f-8976-4885-b9b3-6c2ccae54189: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mri0_rk1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/904f86c6-f9de-41d4-8310-1428b04dc202: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", 
"state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096185.19203: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096185.19210: _low_level_execute_command(): starting 11792 1727096185.19215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096184.4568796-14799-119891755639737/ > /dev/null 2>&1 && sleep 0' 11792 1727096185.19660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096185.19691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096185.19696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.19698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096185.19700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.19751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096185.19754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.19762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.19800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.21766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.21792: stderr chunk (state=3): >>><<< 11792 1727096185.21795: stdout chunk (state=3): >>><<< 11792 1727096185.21810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096185.21817: handler run complete 11792 1727096185.21841: attempt loop complete, returning result 11792 1727096185.21844: _execute() done 11792 1727096185.21846: dumping result to json 11792 1727096185.21853: done dumping result, returning 11792 1727096185.21863: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-d9c7-3fc0-000000000e1a] 11792 1727096185.21865: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1a 11792 1727096185.21984: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1a 11792 1727096185.21987: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11792 1727096185.22092: no more pending results, returning what we have 11792 1727096185.22096: results queue empty 
11792 1727096185.22096: checking for any_errors_fatal 11792 1727096185.22105: done checking for any_errors_fatal 11792 1727096185.22106: checking for max_fail_percentage 11792 1727096185.22107: done checking for max_fail_percentage 11792 1727096185.22108: checking to see if all hosts have failed and the running result is not ok 11792 1727096185.22109: done checking to see if all hosts have failed 11792 1727096185.22110: getting the remaining hosts for this loop 11792 1727096185.22111: done getting the remaining hosts for this loop 11792 1727096185.22115: getting the next task for host managed_node2 11792 1727096185.22123: done getting next task for host managed_node2 11792 1727096185.22126: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096185.22132: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096185.22144: getting variables 11792 1727096185.22145: in VariableManager get_vars() 11792 1727096185.22203: Calling all_inventory to load vars for managed_node2 11792 1727096185.22206: Calling groups_inventory to load vars for managed_node2 11792 1727096185.22208: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096185.22218: Calling all_plugins_play to load vars for managed_node2 11792 1727096185.22220: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096185.22223: Calling groups_plugins_play to load vars for managed_node2 11792 1727096185.23051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096185.24070: done with get_vars() 11792 1727096185.24089: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:56:25 -0400 (0:00:00.961) 0:01:07.520 ****** 11792 1727096185.24162: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096185.24434: worker is 1 (out of 1 available) 11792 1727096185.24448: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11792 1727096185.24466: done queuing things up, now waiting for results queue to drain 11792 1727096185.24469: waiting for pending results... 
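
Note: the module_args echoed in the result above show what drove the "Configure networking connection profiles" step: the fedora.linux_system_roles.network_connections module ran with provider "nm" and the three profiles bond0.1, bond0.0 and bond0 marked persistent_state: absent / state: down, and reported changed=true even though NetworkManager emitted "Connection volatilize aborted ... error=unknown" tracebacks on stdout. A minimal sketch of play-level input that would produce the same invocation is shown below; it is an assumption for illustration, since the test play seen in this log actually builds the list from the controller_profile, port1_profile and port2_profile play vars rather than literal names.

- hosts: managed_node2
  vars:
    network_connections:
      - name: bond0.1
        persistent_state: absent
        state: down
      - name: bond0.0
        persistent_state: absent
        state: down
      - name: bond0
        persistent_state: absent
        state: down
  roles:
    - fedora.linux_system_roles.network
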
11792 1727096185.24659: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11792 1727096185.24767: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e1b 11792 1727096185.24781: variable 'ansible_search_path' from source: unknown 11792 1727096185.24785: variable 'ansible_search_path' from source: unknown 11792 1727096185.24817: calling self._execute() 11792 1727096185.24897: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.24901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.24915: variable 'omit' from source: magic vars 11792 1727096185.25198: variable 'ansible_distribution_major_version' from source: facts 11792 1727096185.25207: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096185.25296: variable 'network_state' from source: role '' defaults 11792 1727096185.25306: Evaluated conditional (network_state != {}): False 11792 1727096185.25309: when evaluation is False, skipping this task 11792 1727096185.25312: _execute() done 11792 1727096185.25315: dumping result to json 11792 1727096185.25317: done dumping result, returning 11792 1727096185.25324: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-d9c7-3fc0-000000000e1b] 11792 1727096185.25329: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1b 11792 1727096185.25422: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1b 11792 1727096185.25425: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11792 1727096185.25507: no more pending results, returning what we have 11792 1727096185.25511: results queue empty 11792 1727096185.25512: checking for any_errors_fatal 11792 1727096185.25523: done checking for any_errors_fatal 11792 1727096185.25524: checking for max_fail_percentage 11792 1727096185.25526: done checking for max_fail_percentage 11792 1727096185.25527: checking to see if all hosts have failed and the running result is not ok 11792 1727096185.25528: done checking to see if all hosts have failed 11792 1727096185.25528: getting the remaining hosts for this loop 11792 1727096185.25530: done getting the remaining hosts for this loop 11792 1727096185.25533: getting the next task for host managed_node2 11792 1727096185.25541: done getting next task for host managed_node2 11792 1727096185.25545: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096185.25550: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096185.25573: getting variables 11792 1727096185.25575: in VariableManager get_vars() 11792 1727096185.25614: Calling all_inventory to load vars for managed_node2 11792 1727096185.25616: Calling groups_inventory to load vars for managed_node2 11792 1727096185.25618: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096185.25626: Calling all_plugins_play to load vars for managed_node2 11792 1727096185.25629: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096185.25631: Calling groups_plugins_play to load vars for managed_node2 11792 1727096185.26441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096185.27336: done with get_vars() 11792 1727096185.27363: done getting variables 11792 1727096185.27414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:56:25 -0400 (0:00:00.032) 0:01:07.553 ****** 11792 1727096185.27442: entering _queue_task() for managed_node2/debug 11792 1727096185.27717: worker is 1 (out of 1 available) 11792 1727096185.27732: exiting _queue_task() for managed_node2/debug 11792 1727096185.27746: done queuing things up, now waiting for results queue to drain 11792 1727096185.27748: waiting for pending results... 
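
Note: the "Configure networking state" task above was skipped because network_state is still the role default of an empty dict, so the conditional network_state != {} was False; only the connection-profile path runs in this play. The two debug tasks queued next simply print fields of the registered __network_connections_result, matching the output that follows. A hedged sketch of what such debug tasks look like (the names mirror the log banners, but this is not a copy of the role's tasks/main.yml):

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
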
11792 1727096185.27942: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11792 1727096185.28044: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e1c 11792 1727096185.28060: variable 'ansible_search_path' from source: unknown 11792 1727096185.28064: variable 'ansible_search_path' from source: unknown 11792 1727096185.28096: calling self._execute() 11792 1727096185.28174: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.28178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.28192: variable 'omit' from source: magic vars 11792 1727096185.28598: variable 'ansible_distribution_major_version' from source: facts 11792 1727096185.28602: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096185.28605: variable 'omit' from source: magic vars 11792 1727096185.28607: variable 'omit' from source: magic vars 11792 1727096185.28629: variable 'omit' from source: magic vars 11792 1727096185.28664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096185.28697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096185.28714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096185.28729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.28738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.28764: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096185.28767: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.28775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.28843: Set connection var ansible_timeout to 10 11792 1727096185.28850: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096185.28860: Set connection var ansible_shell_executable to /bin/sh 11792 1727096185.28866: Set connection var ansible_pipelining to False 11792 1727096185.28870: Set connection var ansible_shell_type to sh 11792 1727096185.28872: Set connection var ansible_connection to ssh 11792 1727096185.28893: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.28896: variable 'ansible_connection' from source: unknown 11792 1727096185.28899: variable 'ansible_module_compression' from source: unknown 11792 1727096185.28901: variable 'ansible_shell_type' from source: unknown 11792 1727096185.28905: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.28907: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.28912: variable 'ansible_pipelining' from source: unknown 11792 1727096185.28914: variable 'ansible_timeout' from source: unknown 11792 1727096185.28918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.29026: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 
1727096185.29036: variable 'omit' from source: magic vars 11792 1727096185.29042: starting attempt loop 11792 1727096185.29045: running the handler 11792 1727096185.29148: variable '__network_connections_result' from source: set_fact 11792 1727096185.29195: handler run complete 11792 1727096185.29213: attempt loop complete, returning result 11792 1727096185.29216: _execute() done 11792 1727096185.29218: dumping result to json 11792 1727096185.29221: done dumping result, returning 11792 1727096185.29227: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-d9c7-3fc0-000000000e1c] 11792 1727096185.29230: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1c 11792 1727096185.29321: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1c 11792 1727096185.29324: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 11792 1727096185.29391: no more pending results, returning what we have 11792 1727096185.29394: results queue empty 11792 1727096185.29395: checking for any_errors_fatal 11792 1727096185.29404: done checking for any_errors_fatal 11792 1727096185.29405: checking for max_fail_percentage 11792 1727096185.29407: done checking for max_fail_percentage 11792 1727096185.29407: checking to see if all hosts have failed and the running result is not ok 11792 1727096185.29408: done checking to see if all hosts have failed 11792 1727096185.29409: getting the remaining hosts for this loop 11792 1727096185.29410: done getting the remaining hosts for this loop 11792 1727096185.29413: getting the next task for host managed_node2 11792 1727096185.29423: done getting next task for host managed_node2 11792 1727096185.29426: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096185.29432: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096185.29445: getting variables 11792 1727096185.29446: in VariableManager get_vars() 11792 1727096185.29494: Calling all_inventory to load vars for managed_node2 11792 1727096185.29497: Calling groups_inventory to load vars for managed_node2 11792 1727096185.29499: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096185.29508: Calling all_plugins_play to load vars for managed_node2 11792 1727096185.29511: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096185.29513: Calling groups_plugins_play to load vars for managed_node2 11792 1727096185.30639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096185.32204: done with get_vars() 11792 1727096185.32238: done getting variables 11792 1727096185.32286: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:25 -0400 (0:00:00.048) 0:01:07.602 ****** 11792 1727096185.32319: entering _queue_task() for managed_node2/debug 11792 1727096185.32591: worker is 1 (out of 1 available) 11792 1727096185.32605: exiting _queue_task() for managed_node2/debug 11792 1727096185.32618: done queuing things up, now waiting for results queue to drain 11792 1727096185.32620: waiting for pending results... 11792 1727096185.32823: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11792 1727096185.32935: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e1d 11792 1727096185.32948: variable 'ansible_search_path' from source: unknown 11792 1727096185.32953: variable 'ansible_search_path' from source: unknown 11792 1727096185.32985: calling self._execute() 11792 1727096185.33089: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.33093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.33095: variable 'omit' from source: magic vars 11792 1727096185.33370: variable 'ansible_distribution_major_version' from source: facts 11792 1727096185.33381: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096185.33387: variable 'omit' from source: magic vars 11792 1727096185.33440: variable 'omit' from source: magic vars 11792 1727096185.33464: variable 'omit' from source: magic vars 11792 1727096185.33499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096185.33528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096185.33544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096185.33560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.33569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.33592: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096185.33595: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.33598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.33670: Set connection var ansible_timeout to 10 11792 1727096185.33676: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096185.33684: Set connection var ansible_shell_executable to /bin/sh 11792 1727096185.33688: Set connection var ansible_pipelining to False 11792 1727096185.33691: Set connection var ansible_shell_type to sh 11792 1727096185.33693: Set connection var ansible_connection to ssh 11792 1727096185.33710: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.33713: variable 'ansible_connection' from source: unknown 11792 1727096185.33716: variable 'ansible_module_compression' from source: unknown 11792 1727096185.33718: variable 'ansible_shell_type' from source: unknown 11792 1727096185.33720: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.33722: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.33725: variable 'ansible_pipelining' from source: unknown 11792 1727096185.33728: variable 'ansible_timeout' from source: unknown 11792 1727096185.33739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.33836: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096185.33849: variable 'omit' from source: magic vars 11792 1727096185.33852: starting attempt loop 11792 1727096185.33857: running the handler 11792 1727096185.33897: variable '__network_connections_result' from source: set_fact 11792 1727096185.33960: variable '__network_connections_result' from source: set_fact 11792 1727096185.34172: handler run complete 11792 1727096185.34175: attempt loop complete, returning result 11792 1727096185.34178: _execute() done 11792 1727096185.34180: dumping result to json 11792 1727096185.34182: done dumping result, returning 11792 1727096185.34184: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-d9c7-3fc0-000000000e1d] 11792 1727096185.34186: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1d 11792 1727096185.34259: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1d 11792 1727096185.34263: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11792 1727096185.34460: no more pending results, returning what we have 11792 1727096185.34465: results queue empty 11792 1727096185.34466: checking for any_errors_fatal 11792 1727096185.34474: done checking for any_errors_fatal 11792 
1727096185.34475: checking for max_fail_percentage 11792 1727096185.34477: done checking for max_fail_percentage 11792 1727096185.34480: checking to see if all hosts have failed and the running result is not ok 11792 1727096185.34481: done checking to see if all hosts have failed 11792 1727096185.34481: getting the remaining hosts for this loop 11792 1727096185.34483: done getting the remaining hosts for this loop 11792 1727096185.34486: getting the next task for host managed_node2 11792 1727096185.34493: done getting next task for host managed_node2 11792 1727096185.34496: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096185.34500: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096185.34512: getting variables 11792 1727096185.34513: in VariableManager get_vars() 11792 1727096185.34551: Calling all_inventory to load vars for managed_node2 11792 1727096185.34563: Calling groups_inventory to load vars for managed_node2 11792 1727096185.34566: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096185.34726: Calling all_plugins_play to load vars for managed_node2 11792 1727096185.34729: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096185.34732: Calling groups_plugins_play to load vars for managed_node2 11792 1727096185.36453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096185.38190: done with get_vars() 11792 1727096185.38222: done getting variables 11792 1727096185.38286: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:25 -0400 (0:00:00.060) 0:01:07.662 ****** 11792 1727096185.38324: entering _queue_task() for managed_node2/debug 11792 1727096185.39001: worker is 1 (out of 1 available) 11792 1727096185.39011: exiting _queue_task() for managed_node2/debug 11792 1727096185.39022: done queuing things up, now waiting for results queue to drain 11792 1727096185.39023: waiting for pending results... 11792 1727096185.39103: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11792 1727096185.39276: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e1e 11792 1727096185.39300: variable 'ansible_search_path' from source: unknown 11792 1727096185.39357: variable 'ansible_search_path' from source: unknown 11792 1727096185.39362: calling self._execute() 11792 1727096185.39454: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.39474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.39487: variable 'omit' from source: magic vars 11792 1727096185.39849: variable 'ansible_distribution_major_version' from source: facts 11792 1727096185.39866: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096185.39993: variable 'network_state' from source: role '' defaults 11792 1727096185.40015: Evaluated conditional (network_state != {}): False 11792 1727096185.40072: when evaluation is False, skipping this task 11792 1727096185.40075: _execute() done 11792 1727096185.40078: dumping result to json 11792 1727096185.40080: done dumping result, returning 11792 1727096185.40083: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-d9c7-3fc0-000000000e1e] 11792 1727096185.40085: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1e 11792 1727096185.40373: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1e 11792 1727096185.40377: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11792 1727096185.40416: no more pending results, returning what we 
have 11792 1727096185.40419: results queue empty 11792 1727096185.40420: checking for any_errors_fatal 11792 1727096185.40428: done checking for any_errors_fatal 11792 1727096185.40429: checking for max_fail_percentage 11792 1727096185.40431: done checking for max_fail_percentage 11792 1727096185.40431: checking to see if all hosts have failed and the running result is not ok 11792 1727096185.40432: done checking to see if all hosts have failed 11792 1727096185.40433: getting the remaining hosts for this loop 11792 1727096185.40434: done getting the remaining hosts for this loop 11792 1727096185.40437: getting the next task for host managed_node2 11792 1727096185.40445: done getting next task for host managed_node2 11792 1727096185.40448: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096185.40453: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096185.40474: getting variables 11792 1727096185.40476: in VariableManager get_vars() 11792 1727096185.40518: Calling all_inventory to load vars for managed_node2 11792 1727096185.40521: Calling groups_inventory to load vars for managed_node2 11792 1727096185.40523: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096185.40533: Calling all_plugins_play to load vars for managed_node2 11792 1727096185.40536: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096185.40538: Calling groups_plugins_play to load vars for managed_node2 11792 1727096185.43200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096185.46404: done with get_vars() 11792 1727096185.46442: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:25 -0400 (0:00:00.084) 0:01:07.746 ****** 11792 1727096185.46763: entering _queue_task() for managed_node2/ping 11792 1727096185.47650: worker is 1 (out of 1 available) 11792 1727096185.47663: exiting _queue_task() for managed_node2/ping 11792 1727096185.47685: done queuing things up, now waiting for results queue to drain 11792 1727096185.47687: waiting for pending results... 
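The trace above walks through three consecutive role tasks: a debug task that prints __network_connections_result, a second debug task that is skipped because its condition (network_state != {}) evaluates to False, and a ping task queued for the connectivity re-test at tasks/main.yml:192. A minimal YAML sketch of that pattern, reconstructed only from the task names, actions and conditionals visible in this log (the real definitions live in roles/network/tasks/main.yml and may differ; the variable printed by the skipped task is an assumption):

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result

    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        var: __network_state_result  # assumed name; the skipped task's variable is not shown in this log
      when: network_state != {}

    - name: Re-test connectivity
      ansible.builtin.ping: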
11792 1727096185.48401: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11792 1727096185.49051: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e1f 11792 1727096185.49275: variable 'ansible_search_path' from source: unknown 11792 1727096185.49280: variable 'ansible_search_path' from source: unknown 11792 1727096185.49282: calling self._execute() 11792 1727096185.49447: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.49464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.49673: variable 'omit' from source: magic vars 11792 1727096185.50474: variable 'ansible_distribution_major_version' from source: facts 11792 1727096185.50479: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096185.50482: variable 'omit' from source: magic vars 11792 1727096185.50874: variable 'omit' from source: magic vars 11792 1727096185.50878: variable 'omit' from source: magic vars 11792 1727096185.50881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096185.50883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096185.50885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096185.50887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.50889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.51273: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096185.51277: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.51280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.51282: Set connection var ansible_timeout to 10 11792 1727096185.51284: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096185.51286: Set connection var ansible_shell_executable to /bin/sh 11792 1727096185.51288: Set connection var ansible_pipelining to False 11792 1727096185.51290: Set connection var ansible_shell_type to sh 11792 1727096185.51292: Set connection var ansible_connection to ssh 11792 1727096185.51294: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.51296: variable 'ansible_connection' from source: unknown 11792 1727096185.51479: variable 'ansible_module_compression' from source: unknown 11792 1727096185.51489: variable 'ansible_shell_type' from source: unknown 11792 1727096185.51497: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.51505: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.51514: variable 'ansible_pipelining' from source: unknown 11792 1727096185.51521: variable 'ansible_timeout' from source: unknown 11792 1727096185.51529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.51962: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11792 1727096185.51982: variable 'omit' from source: magic vars 11792 
1727096185.51992: starting attempt loop 11792 1727096185.51998: running the handler 11792 1727096185.52016: _low_level_execute_command(): starting 11792 1727096185.52028: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096185.53600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.53801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.53828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.53904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.55619: stdout chunk (state=3): >>>/root <<< 11792 1727096185.55763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.55779: stdout chunk (state=3): >>><<< 11792 1727096185.55791: stderr chunk (state=3): >>><<< 11792 1727096185.55817: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096185.55849: _low_level_execute_command(): starting 11792 1727096185.55865: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818 `" && echo ansible-tmp-1727096185.55835-14844-181128653555818="` echo /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818 `" ) && sleep 0' 
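The two shell commands above (the 'echo ~' probe and the umask/mkdir pair) stage a per-task remote temp directory under ~/.ansible/tmp because ansible_pipelining was set to False for this connection; the chunks that follow upload the AnsiballZ_ping.py payload over sftp, chmod it, run it with /usr/bin/python3.12, and finally remove the directory. A hedged sketch of how a play could opt into pipelining and skip that staging for modules which support it (illustrative only; the host and task names are taken from this log):

    - hosts: managed_node2
      vars:
        ansible_pipelining: true  # stream the module payload over the open SSH channel instead of sftp + tmp dir
      tasks:
        - name: Re-test connectivity
          ansible.builtin.ping:

The usual caveat applies: pipelining requires that privilege escalation not need a tty (the classic sudo requiretty case), which is why it ships disabled by default.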
11792 1727096185.56540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096185.56558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096185.56578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096185.56598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096185.56699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.56728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.56798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.58872: stdout chunk (state=3): >>>ansible-tmp-1727096185.55835-14844-181128653555818=/root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818 <<< 11792 1727096185.58962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.59055: stderr chunk (state=3): >>><<< 11792 1727096185.59092: stdout chunk (state=3): >>><<< 11792 1727096185.59123: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096185.55835-14844-181128653555818=/root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096185.59187: variable 'ansible_module_compression' from source: unknown 11792 1727096185.59259: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11792 1727096185.59288: variable 'ansible_facts' 
from source: unknown 11792 1727096185.59374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/AnsiballZ_ping.py 11792 1727096185.59609: Sending initial data 11792 1727096185.59616: Sent initial data (151 bytes) 11792 1727096185.60180: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.60233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096185.60271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.60291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.60354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.62401: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096185.62410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096185.62449: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmphzxk7r0p /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/AnsiballZ_ping.py <<< 11792 1727096185.62452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/AnsiballZ_ping.py" <<< 11792 1727096185.62458: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmphzxk7r0p" to remote "/root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/AnsiballZ_ping.py" <<< 11792 1727096185.63450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.63505: stderr chunk (state=3): >>><<< 11792 1727096185.63514: stdout chunk (state=3): >>><<< 11792 1727096185.63542: done transferring module to remote 11792 1727096185.63561: _low_level_execute_command(): starting 11792 1727096185.63574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/ /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/AnsiballZ_ping.py && sleep 0' 11792 1727096185.64187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096185.64286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.64313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096185.64340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.64354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.64432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.66370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.66411: stdout chunk (state=3): >>><<< 11792 1727096185.66478: stderr chunk (state=3): >>><<< 11792 1727096185.66677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096185.66686: _low_level_execute_command(): starting 11792 1727096185.66688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/AnsiballZ_ping.py && sleep 0' 11792 1727096185.67603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096185.67607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096185.67628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096185.67634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.67654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096185.67727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096185.67731: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.67836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.83572: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11792 1727096185.84905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096185.84925: stderr chunk (state=3): >>><<< 11792 1727096185.84928: stdout chunk (state=3): >>><<< 11792 1727096185.84944: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096185.84965: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096185.84975: _low_level_execute_command(): starting 11792 1727096185.84980: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096185.55835-14844-181128653555818/ > /dev/null 2>&1 && sleep 0' 11792 1727096185.85441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096185.85444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.85447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096185.85449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.85503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096185.85506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.85510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.85548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.87432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.87457: stderr chunk (state=3): >>><<< 11792 1727096185.87461: stdout chunk (state=3): >>><<< 11792 1727096185.87484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096185.87491: handler run complete 11792 1727096185.87503: attempt loop complete, returning result 11792 1727096185.87506: _execute() done 11792 1727096185.87508: dumping result to json 11792 1727096185.87510: done dumping result, returning 11792 1727096185.87519: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-d9c7-3fc0-000000000e1f] 11792 1727096185.87523: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1f 11792 1727096185.87614: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e1f 11792 1727096185.87617: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11792 1727096185.87681: no more pending results, returning what we have 11792 1727096185.87685: results queue empty 11792 1727096185.87685: checking for any_errors_fatal 11792 1727096185.87692: done checking for any_errors_fatal 11792 1727096185.87692: checking for max_fail_percentage 11792 1727096185.87694: done checking for max_fail_percentage 11792 1727096185.87695: checking to see if all hosts have failed and the running result is not ok 11792 1727096185.87696: done checking to see if all hosts have failed 11792 1727096185.87696: getting the remaining hosts for this loop 11792 1727096185.87698: done getting the remaining hosts for this loop 11792 1727096185.87701: getting the next task for host managed_node2 11792 1727096185.87711: done getting next task for host managed_node2 11792 1727096185.87712: ^ task is: TASK: meta (role_complete) 11792 1727096185.87718: ^ state is: 
HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096185.87729: getting variables 11792 1727096185.87730: in VariableManager get_vars() 11792 1727096185.87780: Calling all_inventory to load vars for managed_node2 11792 1727096185.87783: Calling groups_inventory to load vars for managed_node2 11792 1727096185.87785: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096185.87795: Calling all_plugins_play to load vars for managed_node2 11792 1727096185.87798: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096185.87800: Calling groups_plugins_play to load vars for managed_node2 11792 1727096185.88616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096185.89599: done with get_vars() 11792 1727096185.89618: done getting variables 11792 1727096185.89681: done queuing things up, now waiting for results queue to drain 11792 1727096185.89683: results queue empty 11792 1727096185.89683: checking for any_errors_fatal 11792 1727096185.89685: done checking for any_errors_fatal 11792 1727096185.89686: checking for max_fail_percentage 11792 1727096185.89686: done checking for max_fail_percentage 11792 1727096185.89687: checking to see if all hosts have failed and the running result is not ok 11792 1727096185.89687: done checking to see if all hosts have failed 11792 1727096185.89688: getting the remaining hosts for this loop 11792 1727096185.89688: done getting the remaining hosts for this loop 11792 1727096185.89690: getting the next task for host managed_node2 11792 1727096185.89693: done getting next task for host managed_node2 11792 1727096185.89695: ^ task is: TASK: Delete the device '{{ controller_device }}' 11792 1727096185.89696: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096185.89698: getting variables 11792 1727096185.89699: in VariableManager get_vars() 11792 1727096185.89710: Calling all_inventory to load vars for managed_node2 11792 1727096185.89712: Calling groups_inventory to load vars for managed_node2 11792 1727096185.89714: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096185.89718: Calling all_plugins_play to load vars for managed_node2 11792 1727096185.89720: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096185.89722: Calling groups_plugins_play to load vars for managed_node2 11792 1727096185.90349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096185.91202: done with get_vars() 11792 1727096185.91219: done getting variables 11792 1727096185.91253: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11792 1727096185.91344: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Monday 23 September 2024 08:56:25 -0400 (0:00:00.446) 0:01:08.192 ****** 11792 1727096185.91371: entering _queue_task() for managed_node2/command 11792 1727096185.91643: worker is 1 (out of 1 available) 11792 1727096185.91656: exiting _queue_task() for managed_node2/command 11792 1727096185.91671: done queuing things up, now waiting for results queue to drain 11792 1727096185.91674: waiting for pending results... 
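With the fedora.linux_system_roles.network role finished (the meta (role_complete) task above), execution returns to the test playbook and queues a cleanup task from cleanup_bond_profile+device.yml:22 that runs the command action with the play variable controller_device (rendered as 'nm-bond' in the task banner). The exact command line is not visible in this excerpt, so the sketch below is an assumption of a typical device cleanup rather than the file's actual contents:

    - name: Delete the device '{{ controller_device }}'
      ansible.builtin.command: ip link del {{ controller_device }}  # assumed command; the real one is defined in cleanup_bond_profile+device.yml
      when: controller_device is defined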
11792 1727096185.91861: running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' 11792 1727096185.91945: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e4f 11792 1727096185.91957: variable 'ansible_search_path' from source: unknown 11792 1727096185.91961: variable 'ansible_search_path' from source: unknown 11792 1727096185.91992: calling self._execute() 11792 1727096185.92076: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.92080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.92089: variable 'omit' from source: magic vars 11792 1727096185.92372: variable 'ansible_distribution_major_version' from source: facts 11792 1727096185.92382: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096185.92389: variable 'omit' from source: magic vars 11792 1727096185.92404: variable 'omit' from source: magic vars 11792 1727096185.92475: variable 'controller_device' from source: play vars 11792 1727096185.92490: variable 'omit' from source: magic vars 11792 1727096185.92523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096185.92551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096185.92572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096185.92586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.92594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096185.92617: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096185.92620: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.92623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.92698: Set connection var ansible_timeout to 10 11792 1727096185.92705: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096185.92712: Set connection var ansible_shell_executable to /bin/sh 11792 1727096185.92718: Set connection var ansible_pipelining to False 11792 1727096185.92720: Set connection var ansible_shell_type to sh 11792 1727096185.92723: Set connection var ansible_connection to ssh 11792 1727096185.92740: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.92743: variable 'ansible_connection' from source: unknown 11792 1727096185.92745: variable 'ansible_module_compression' from source: unknown 11792 1727096185.92748: variable 'ansible_shell_type' from source: unknown 11792 1727096185.92750: variable 'ansible_shell_executable' from source: unknown 11792 1727096185.92752: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096185.92757: variable 'ansible_pipelining' from source: unknown 11792 1727096185.92762: variable 'ansible_timeout' from source: unknown 11792 1727096185.92766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096185.92869: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11792 1727096185.92880: variable 'omit' from source: magic vars 11792 1727096185.92884: starting attempt loop 11792 1727096185.92887: running the handler 11792 1727096185.92901: _low_level_execute_command(): starting 11792 1727096185.92908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096185.93423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096185.93426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.93430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096185.93435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.93490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096185.93493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.93496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.93537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.95226: stdout chunk (state=3): >>>/root <<< 11792 1727096185.95317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.95348: stderr chunk (state=3): >>><<< 11792 1727096185.95351: stdout chunk (state=3): >>><<< 11792 1727096185.95380: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096185.95396: _low_level_execute_command(): starting 11792 1727096185.95402: _low_level_execute_command(): executing: 
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933 `" && echo ansible-tmp-1727096185.9538221-14868-134886718037933="` echo /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933 `" ) && sleep 0' 11792 1727096185.95863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096185.95867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096185.95879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096185.95882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096185.95885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.95927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096185.95936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096185.95940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.95973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096185.97958: stdout chunk (state=3): >>>ansible-tmp-1727096185.9538221-14868-134886718037933=/root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933 <<< 11792 1727096185.98057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096185.98092: stderr chunk (state=3): >>><<< 11792 1727096185.98096: stdout chunk (state=3): >>><<< 11792 1727096185.98112: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096185.9538221-14868-134886718037933=/root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096185.98140: variable 'ansible_module_compression' from source: unknown 11792 1727096185.98185: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096185.98218: variable 'ansible_facts' from source: unknown 11792 1727096185.98276: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/AnsiballZ_command.py 11792 1727096185.98380: Sending initial data 11792 1727096185.98384: Sent initial data (156 bytes) 11792 1727096185.98843: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096185.98847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.98850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096185.98852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096185.98854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096185.98905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096185.98908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096185.98953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.00620: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096186.00644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096186.00679: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpcx__5d0h /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/AnsiballZ_command.py <<< 11792 1727096186.00693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/AnsiballZ_command.py" <<< 11792 1727096186.00715: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpcx__5d0h" to remote "/root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/AnsiballZ_command.py" <<< 11792 1727096186.00718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/AnsiballZ_command.py" <<< 11792 1727096186.01203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.01250: stderr chunk (state=3): >>><<< 11792 1727096186.01253: stdout chunk (state=3): >>><<< 11792 1727096186.01297: done transferring module to remote 11792 1727096186.01306: _low_level_execute_command(): starting 11792 1727096186.01311: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/ /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/AnsiballZ_command.py && sleep 0' 11792 1727096186.01764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.01770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.01773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.01783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.01833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.01837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.01839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.01879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.03781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.03803: stderr chunk (state=3): >>><<< 11792 1727096186.03806: stdout chunk (state=3): >>><<< 11792 1727096186.03821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.03824: _low_level_execute_command(): starting 11792 1727096186.03835: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/AnsiballZ_command.py && sleep 0' 11792 1727096186.04299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.04303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096186.04305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096186.04307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.04354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.04364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.04366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.04405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.20920: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:56:26.200312", "end": "2024-09-23 08:56:26.207738", "delta": "0:00:00.007426", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096186.22413: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
<<< 11792 1727096186.22442: stderr chunk (state=3): >>><<< 11792 1727096186.22445: stdout chunk (state=3): >>><<< 11792 1727096186.22468: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:56:26.200312", "end": "2024-09-23 08:56:26.207738", "delta": "0:00:00.007426", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
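The module result above shows 'ip link del nm-bond' exiting with rc=1 and the message Cannot find device "nm-bond": the bond was already absent, and the play tolerates that (the task result that follows reports failed_when_result: false and the task is shown as ok). A minimal standalone sketch of the same tolerant cleanup, assuming only a POSIX shell and the iproute2 'ip' tool (this is not the task's actual definition, which is not shown in this log), is:

    # Sketch only: delete the bond if it exists; treat a missing device as success.
    if ip link show nm-bond > /dev/null 2>&1; then
        ip link del nm-bond
    else
        echo "nm-bond not present, nothing to delete"
    fi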
11792 1727096186.22498: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096186.22504: _low_level_execute_command(): starting 11792 1727096186.22509: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096185.9538221-14868-134886718037933/ > /dev/null 2>&1 && sleep 0' 11792 1727096186.22966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.22971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096186.22978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11792 1727096186.22980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.22982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.23066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.23100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.25064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.25275: stderr chunk (state=3): >>><<< 11792 1727096186.25278: stdout chunk (state=3): >>><<< 11792 1727096186.25281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.25283: handler run complete 11792 1727096186.25285: Evaluated conditional (False): False 11792 1727096186.25287: Evaluated conditional (False): False 11792 1727096186.25289: attempt loop complete, returning result 11792 1727096186.25291: _execute() done 11792 1727096186.25293: dumping result to json 11792 1727096186.25295: done dumping result, returning 11792 1727096186.25296: done running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' [0afff68d-5257-d9c7-3fc0-000000000e4f] 11792 1727096186.25298: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e4f 11792 1727096186.25381: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e4f 11792 1727096186.25385: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007426", "end": "2024-09-23 08:56:26.207738", "failed_when_result": false, "rc": 1, "start": "2024-09-23 08:56:26.200312" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11792 1727096186.25458: no more pending results, returning what we have 11792 1727096186.25462: results queue empty 11792 1727096186.25463: checking for any_errors_fatal 11792 1727096186.25466: done checking for any_errors_fatal 11792 1727096186.25466: checking for max_fail_percentage 11792 1727096186.25470: done checking for max_fail_percentage 11792 1727096186.25471: checking to see if all hosts have failed and the running result is not ok 11792 1727096186.25472: done checking to see if all hosts have failed 11792 1727096186.25473: getting the remaining hosts for this loop 11792 1727096186.25474: done getting the remaining hosts for this loop 11792 1727096186.25478: getting the next task for host managed_node2 11792 1727096186.25492: done getting next task for host managed_node2 11792 1727096186.25498: ^ task is: TASK: Remove test interfaces 11792 1727096186.25503: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096186.25508: getting variables 11792 1727096186.25510: in VariableManager get_vars() 11792 1727096186.25562: Calling all_inventory to load vars for managed_node2 11792 1727096186.25565: Calling groups_inventory to load vars for managed_node2 11792 1727096186.25680: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096186.25695: Calling all_plugins_play to load vars for managed_node2 11792 1727096186.25698: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096186.25701: Calling groups_plugins_play to load vars for managed_node2 11792 1727096186.27622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096186.29089: done with get_vars() 11792 1727096186.29117: done getting variables 11792 1727096186.29182: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:56:26 -0400 (0:00:00.378) 0:01:08.571 ****** 11792 1727096186.29215: entering _queue_task() for managed_node2/shell 11792 1727096186.29574: worker is 1 (out of 1 available) 11792 1727096186.29587: exiting _queue_task() for managed_node2/shell 11792 1727096186.29600: done queuing things up, now waiting for results queue to drain 11792 1727096186.29601: waiting for pending results... 
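The task queued here runs a short shell script on the managed node; the exact script is visible in the module invocation further down in this log. It deletes the links test1, test2 and testbr, printing an error for any link it cannot remove rather than aborting. Reformatted for readability (the content is taken from the module output below; the comments are added here and assume a shell that supports set -o pipefail, which /bin/sh on the managed node evidently does, plus the iproute2 'ip' command):

    set -euxo pipefail     # strict mode; the || guards below keep the script going
    exec 1>&2              # route all output to stderr so the task captures it there
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link testbr - error "$rc"
    fi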
11792 1727096186.29949: running TaskExecutor() for managed_node2/TASK: Remove test interfaces 11792 1727096186.30046: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e55 11792 1727096186.30050: variable 'ansible_search_path' from source: unknown 11792 1727096186.30053: variable 'ansible_search_path' from source: unknown 11792 1727096186.30056: calling self._execute() 11792 1727096186.30166: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096186.30175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096186.30185: variable 'omit' from source: magic vars 11792 1727096186.30591: variable 'ansible_distribution_major_version' from source: facts 11792 1727096186.30594: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096186.30597: variable 'omit' from source: magic vars 11792 1727096186.30643: variable 'omit' from source: magic vars 11792 1727096186.30808: variable 'dhcp_interface1' from source: play vars 11792 1727096186.30812: variable 'dhcp_interface2' from source: play vars 11792 1727096186.30831: variable 'omit' from source: magic vars 11792 1727096186.30879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096186.30917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096186.30934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096186.30955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096186.30970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096186.30997: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096186.31000: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096186.31003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096186.31109: Set connection var ansible_timeout to 10 11792 1727096186.31117: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096186.31126: Set connection var ansible_shell_executable to /bin/sh 11792 1727096186.31133: Set connection var ansible_pipelining to False 11792 1727096186.31136: Set connection var ansible_shell_type to sh 11792 1727096186.31138: Set connection var ansible_connection to ssh 11792 1727096186.31160: variable 'ansible_shell_executable' from source: unknown 11792 1727096186.31163: variable 'ansible_connection' from source: unknown 11792 1727096186.31165: variable 'ansible_module_compression' from source: unknown 11792 1727096186.31174: variable 'ansible_shell_type' from source: unknown 11792 1727096186.31177: variable 'ansible_shell_executable' from source: unknown 11792 1727096186.31179: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096186.31183: variable 'ansible_pipelining' from source: unknown 11792 1727096186.31186: variable 'ansible_timeout' from source: unknown 11792 1727096186.31190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096186.31331: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096186.31351: variable 'omit' from source: magic vars 11792 1727096186.31354: starting attempt loop 11792 1727096186.31356: running the handler 11792 1727096186.31359: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096186.31461: _low_level_execute_command(): starting 11792 1727096186.31464: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096186.32085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096186.32097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.32107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.32123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096186.32135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096186.32142: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096186.32152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.32172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096186.32179: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096186.32186: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11792 1727096186.32194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.32204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.32216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096186.32223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096186.32285: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.32310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.32331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.32340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.32410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.34160: stdout chunk (state=3): >>>/root <<< 11792 1727096186.34345: stdout chunk (state=3): >>><<< 11792 1727096186.34356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.34366: stderr chunk (state=3): >>><<< 11792 1727096186.34394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.34420: _low_level_execute_command(): starting 11792 1727096186.34431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598 `" && echo ansible-tmp-1727096186.3440728-14877-230863254749598="` echo /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598 `" ) && sleep 0' 11792 1727096186.35043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096186.35057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.35074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.35092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096186.35116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096186.35213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.35230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.35298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.37275: stdout chunk (state=3): >>>ansible-tmp-1727096186.3440728-14877-230863254749598=/root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598 <<< 11792 1727096186.37420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.37431: stdout chunk (state=3): >>><<< 11792 1727096186.37448: stderr chunk (state=3): >>><<< 11792 1727096186.37474: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096186.3440728-14877-230863254749598=/root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.37548: variable 'ansible_module_compression' from source: unknown 11792 1727096186.37576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096186.37616: variable 'ansible_facts' from source: unknown 11792 1727096186.37710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/AnsiballZ_command.py 11792 1727096186.37895: Sending initial data 11792 1727096186.37897: Sent initial data (156 bytes) 11792 1727096186.38475: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096186.38585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.38610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.38677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.40277: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096186.40404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096186.40407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp6hqstvyi /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/AnsiballZ_command.py <<< 11792 1727096186.40409: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/AnsiballZ_command.py" <<< 11792 1727096186.40412: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp6hqstvyi" to remote "/root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/AnsiballZ_command.py" <<< 11792 1727096186.41214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.41314: stdout chunk (state=3): >>><<< 11792 1727096186.41317: stderr chunk (state=3): >>><<< 11792 1727096186.41320: done transferring module to remote 11792 1727096186.41341: _low_level_execute_command(): starting 11792 1727096186.41352: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/ /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/AnsiballZ_command.py && sleep 0' 11792 1727096186.42078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096186.42086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.42089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.42276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.42283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.42285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.44071: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 11792 1727096186.44075: stdout chunk (state=3): >>><<< 11792 1727096186.44078: stderr chunk (state=3): >>><<< 11792 1727096186.44095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.44105: _low_level_execute_command(): starting 11792 1727096186.44176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/AnsiballZ_command.py && sleep 0' 11792 1727096186.44774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096186.44844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.44911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.44942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.44964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.45055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.65575: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || 
rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:56:26.605348", "end": "2024-09-23 08:56:26.650487", "delta": "0:00:00.045139", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096186.67000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.67062: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. <<< 11792 1727096186.67066: stdout chunk (state=3): >>><<< 11792 1727096186.67075: stderr chunk (state=3): >>><<< 11792 1727096186.67291: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:56:26.605348", "end": "2024-09-23 08:56:26.650487", "delta": "0:00:00.045139", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096186.67328: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096186.67335: _low_level_execute_command(): starting 11792 1727096186.67341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096186.3440728-14877-230863254749598/ > /dev/null 2>&1 && sleep 0' 11792 1727096186.69073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.69077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 11792 1727096186.69173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.69176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.69178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.69180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.69315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.69344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.71250: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11792 1727096186.71263: stdout chunk (state=3): >>><<< 11792 1727096186.71278: stderr chunk (state=3): >>><<< 11792 1727096186.71573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.71577: handler run complete 11792 1727096186.71580: Evaluated conditional (False): False 11792 1727096186.71582: attempt loop complete, returning result 11792 1727096186.71584: _execute() done 11792 1727096186.71586: dumping result to json 11792 1727096186.71588: done dumping result, returning 11792 1727096186.71590: done running TaskExecutor() for managed_node2/TASK: Remove test interfaces [0afff68d-5257-d9c7-3fc0-000000000e55] 11792 1727096186.71592: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e55 11792 1727096186.71665: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e55 ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.045139", "end": "2024-09-23 08:56:26.650487", "rc": 0, "start": "2024-09-23 08:56:26.605348" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11792 1727096186.71737: no more pending results, returning what we have 11792 1727096186.71740: results queue empty 11792 1727096186.71741: checking for any_errors_fatal 11792 1727096186.71755: done checking for any_errors_fatal 11792 1727096186.71755: checking for max_fail_percentage 11792 1727096186.71757: done checking for max_fail_percentage 11792 1727096186.71759: checking to see if all hosts have failed and the running result is not ok 11792 1727096186.71759: done checking to see if all hosts have failed 11792 1727096186.71760: getting the remaining hosts for this loop 11792 1727096186.71762: done getting the remaining hosts for this loop 11792 1727096186.71765: getting the next task for host managed_node2 11792 1727096186.71775: done getting next task for host 
managed_node2 11792 1727096186.71778: ^ task is: TASK: Stop dnsmasq/radvd services 11792 1727096186.71782: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096186.71786: getting variables 11792 1727096186.71788: in VariableManager get_vars() 11792 1727096186.71840: Calling all_inventory to load vars for managed_node2 11792 1727096186.71844: Calling groups_inventory to load vars for managed_node2 11792 1727096186.71846: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096186.71859: Calling all_plugins_play to load vars for managed_node2 11792 1727096186.71862: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096186.71976: Calling groups_plugins_play to load vars for managed_node2 11792 1727096186.71871: WORKER PROCESS EXITING 11792 1727096186.74489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096186.77576: done with get_vars() 11792 1727096186.77615: done getting variables 11792 1727096186.77712: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Monday 23 September 2024 08:56:26 -0400 (0:00:00.485) 0:01:09.056 ****** 11792 1727096186.77746: entering _queue_task() for managed_node2/shell 11792 1727096186.78190: worker is 1 (out of 1 available) 11792 1727096186.78361: exiting _queue_task() for managed_node2/shell 11792 1727096186.78375: done queuing things up, now waiting for results queue to drain 11792 1727096186.78377: waiting for pending results... 
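At this point only the queuing and the task header for "Stop dnsmasq/radvd services" have appeared; the command the task runs is not yet visible in the log. As a hedged sketch only (an assumption, not the task's actual body), stopping test instances of dnsmasq and radvd while tolerating the case where none are running could be as simple as:

    # Hypothetical sketch; the real commands are not visible in this part of the log.
    pkill dnsmasq || :    # ignore the error when no dnsmasq process is running
    pkill radvd || :      # likewise for radvd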
11792 1727096186.78522: running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services 11792 1727096186.78662: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e56 11792 1727096186.78688: variable 'ansible_search_path' from source: unknown 11792 1727096186.78697: variable 'ansible_search_path' from source: unknown 11792 1727096186.78743: calling self._execute() 11792 1727096186.78858: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096186.78874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096186.78888: variable 'omit' from source: magic vars 11792 1727096186.79288: variable 'ansible_distribution_major_version' from source: facts 11792 1727096186.79307: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096186.79319: variable 'omit' from source: magic vars 11792 1727096186.79396: variable 'omit' from source: magic vars 11792 1727096186.79438: variable 'omit' from source: magic vars 11792 1727096186.79488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096186.79531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096186.79558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096186.79590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096186.79606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096186.79647: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096186.79657: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096186.79666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096186.79779: Set connection var ansible_timeout to 10 11792 1727096186.79796: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096186.79815: Set connection var ansible_shell_executable to /bin/sh 11792 1727096186.79826: Set connection var ansible_pipelining to False 11792 1727096186.79834: Set connection var ansible_shell_type to sh 11792 1727096186.79840: Set connection var ansible_connection to ssh 11792 1727096186.79869: variable 'ansible_shell_executable' from source: unknown 11792 1727096186.79880: variable 'ansible_connection' from source: unknown 11792 1727096186.79887: variable 'ansible_module_compression' from source: unknown 11792 1727096186.79894: variable 'ansible_shell_type' from source: unknown 11792 1727096186.79900: variable 'ansible_shell_executable' from source: unknown 11792 1727096186.79914: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096186.79923: variable 'ansible_pipelining' from source: unknown 11792 1727096186.79930: variable 'ansible_timeout' from source: unknown 11792 1727096186.79937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096186.80241: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096186.80288: variable 'omit' from source: magic vars 11792 
1727096186.80351: starting attempt loop 11792 1727096186.80359: running the handler 11792 1727096186.80380: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096186.80406: _low_level_execute_command(): starting 11792 1727096186.80461: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096186.81563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.81608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.81632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.81688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.83527: stdout chunk (state=3): >>>/root <<< 11792 1727096186.83606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.83610: stdout chunk (state=3): >>><<< 11792 1727096186.83621: stderr chunk (state=3): >>><<< 11792 1727096186.83640: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.83809: _low_level_execute_command(): starting 11792 1727096186.83816: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409 `" && echo ansible-tmp-1727096186.8371491-14902-121379686476409="` echo /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409 `" ) && sleep 0' 11792 1727096186.84675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096186.84686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.84773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.84779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096186.84790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096186.84793: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.84847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.84863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.84874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.84940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.87078: stdout chunk (state=3): >>>ansible-tmp-1727096186.8371491-14902-121379686476409=/root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409 <<< 11792 1727096186.87082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.87084: stdout chunk (state=3): >>><<< 11792 1727096186.87086: stderr chunk (state=3): >>><<< 11792 1727096186.87105: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096186.8371491-14902-121379686476409=/root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.87197: variable 'ansible_module_compression' from source: unknown 11792 1727096186.87240: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096186.87290: variable 'ansible_facts' from source: unknown 11792 1727096186.87392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py 11792 1727096186.87718: Sending initial data 11792 1727096186.87820: Sent initial data (156 bytes) 11792 1727096186.88547: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096186.88563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.88591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.88607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096186.88623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096186.88694: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.88735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.88759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.88776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.88845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.90500: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096186.90769: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096186.90772: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp0iwjnr_j /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py <<< 11792 1727096186.90874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp0iwjnr_j" to remote "/root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py" <<< 11792 1727096186.92174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.92178: stderr chunk (state=3): >>><<< 11792 1727096186.92180: stdout chunk (state=3): >>><<< 11792 1727096186.92182: done transferring module to remote 11792 1727096186.92184: _low_level_execute_command(): starting 11792 1727096186.92187: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/ /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py && sleep 0' 11792 1727096186.92971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096186.92977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.92999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096186.93007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096186.93048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096186.93088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096186.93091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096186.93138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.93165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096186.95401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096186.95405: stdout chunk (state=3): >>><<< 11792 1727096186.95408: stderr chunk (state=3): >>><<< 11792 1727096186.95411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096186.95413: _low_level_execute_command(): starting 11792 1727096186.95416: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py && sleep 0' 11792 1727096186.96583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096186.96619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.15259: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:56:27.123880", "end": "2024-09-23 08:56:27.150865", "delta": "0:00:00.026985", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd 
server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096187.16952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096187.16956: stdout chunk (state=3): >>><<< 11792 1727096187.16958: stderr chunk (state=3): >>><<< 11792 1727096187.17094: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:56:27.123880", "end": "2024-09-23 08:56:27.150865", "delta": "0:00:00.026985", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096187.17104: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096187.17107: _low_level_execute_command(): starting 11792 1727096187.17109: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/ > /dev/null 2>&1 && sleep 0' 11792 1727096187.18691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096187.18805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.18942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.19031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.20887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.20940: stderr chunk (state=3): >>><<< 11792 1727096187.20985: stdout chunk (state=3): >>><<< 11792 1727096187.21005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.21017: handler run complete 11792 1727096187.21103: Evaluated conditional (False): False 11792 1727096187.21121: attempt loop complete, returning result 11792 1727096187.21129: _execute() done 11792 1727096187.21137: dumping result to json 11792 1727096187.21199: done dumping result, returning 11792 1727096187.21213: done running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services [0afff68d-5257-d9c7-3fc0-000000000e56] 11792 1727096187.21223: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e56 11792 1727096187.21517: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e56 11792 1727096187.21520: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026985", "end": "2024-09-23 08:56:27.150865", "rc": 0, "start": "2024-09-23 08:56:27.123880" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11792 1727096187.21593: no more pending results, returning what we have 11792 1727096187.21597: results queue empty 11792 1727096187.21598: checking for any_errors_fatal 11792 1727096187.21610: done checking for any_errors_fatal 11792 1727096187.21611: checking for max_fail_percentage 11792 1727096187.21613: done checking for max_fail_percentage 11792 1727096187.21614: checking to see if all hosts have failed and the running result is not ok 11792 1727096187.21615: done checking to see if all hosts have failed 11792 1727096187.21615: getting the remaining hosts for this loop 11792 1727096187.21617: done getting the remaining hosts for this loop 11792 1727096187.21621: getting the next task for host managed_node2 11792 1727096187.21632: done getting next task for host managed_node2 11792 1727096187.21635: ^ task is: TASK: Check routes and DNS 11792 1727096187.21640: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096187.21644: getting variables 11792 1727096187.21645: in VariableManager get_vars() 11792 1727096187.21802: Calling all_inventory to load vars for managed_node2 11792 1727096187.21805: Calling groups_inventory to load vars for managed_node2 11792 1727096187.21808: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096187.21820: Calling all_plugins_play to load vars for managed_node2 11792 1727096187.21829: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096187.21834: Calling groups_plugins_play to load vars for managed_node2 11792 1727096187.25052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096187.28423: done with get_vars() 11792 1727096187.28459: done getting variables 11792 1727096187.28527: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 08:56:27 -0400 (0:00:00.508) 0:01:09.564 ****** 11792 1727096187.28561: entering _queue_task() for managed_node2/shell 11792 1727096187.29331: worker is 1 (out of 1 available) 11792 1727096187.29345: exiting _queue_task() for managed_node2/shell 11792 1727096187.29359: done queuing things up, now waiting for results queue to drain 11792 1727096187.29361: waiting for pending results... 
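For reference, the script that the "Stop dnsmasq/radvd services" task above handed to the shell action is carried in the module result only with JSON-escaped newlines. Unescaped and re-indented for readability (indentation is approximate; the escaped form in the result above is the authoritative copy), it reads:

set -uxo pipefail
exec 1>&2
pkill -F /run/dhcp_testbr.pid
rm -rf /run/dhcp_testbr.pid
rm -rf /run/dhcp_testbr.lease
if grep 'release 6' /etc/redhat-release; then
    # Stop radvd server
    service radvd stop
    iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
fi
if systemctl is-active firewalld; then
    for service in dhcp dhcpv6 dhcpv6-client; do
        if firewall-cmd --query-service="$service"; then
            firewall-cmd --remove-service "$service"
        fi
    done
fi

On this run the stderr trace shows that grep 'release 6' found no match and firewalld reported inactive, so only the pkill/rm cleanup steps actually did anything and the task returned rc=0.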
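The surrounding _low_level_execute_command entries record the exact remote command sequence Ansible used to run that module over the multiplexed SSH connection. Collected in order (all commands quoted verbatim from the log above), it was:

/bin/sh -c 'echo ~ && sleep 0'
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409 `" && echo ansible-tmp-1727096186.8371491-14902-121379686476409="` echo /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409 `" ) && sleep 0'
sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmp0iwjnr_j /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/ /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py && sleep 0'
/bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/AnsiballZ_command.py && sleep 0'
/bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096186.8371491-14902-121379686476409/ > /dev/null 2>&1 && sleep 0'

That is: home-directory probe, remote temp directory creation, SFTP transfer of the AnsiballZ_command.py wrapper, chmod, execution with the remote Python, and cleanup of the temp directory.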
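The "Check routes and DNS" task queued above (check_network_dns.yml:6) runs, in the entries that follow, a diagnostic script that likewise appears only in JSON-escaped form inside its module result. A readable, re-indented rendering (indentation approximate; the escaped form below is authoritative):

set -euo pipefail
echo IP
ip a
echo IP ROUTE
ip route
echo IP -6 ROUTE
ip -6 route
echo RESOLV
if [ -f /etc/resolv.conf ]; then
    cat /etc/resolv.conf
else
    echo NO /etc/resolv.conf
    ls -alrtF /etc/resolv.* || :
fi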
11792 1727096187.29913: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 11792 1727096187.30384: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e5a 11792 1727096187.30388: variable 'ansible_search_path' from source: unknown 11792 1727096187.30391: variable 'ansible_search_path' from source: unknown 11792 1727096187.30394: calling self._execute() 11792 1727096187.30510: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096187.30602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096187.30616: variable 'omit' from source: magic vars 11792 1727096187.31444: variable 'ansible_distribution_major_version' from source: facts 11792 1727096187.31481: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096187.31523: variable 'omit' from source: magic vars 11792 1727096187.31625: variable 'omit' from source: magic vars 11792 1727096187.31719: variable 'omit' from source: magic vars 11792 1727096187.31873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096187.31903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096187.31928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096187.32028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096187.32058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096187.32092: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096187.32113: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096187.32338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096187.32404: Set connection var ansible_timeout to 10 11792 1727096187.32417: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096187.32429: Set connection var ansible_shell_executable to /bin/sh 11792 1727096187.32464: Set connection var ansible_pipelining to False 11792 1727096187.32665: Set connection var ansible_shell_type to sh 11792 1727096187.32670: Set connection var ansible_connection to ssh 11792 1727096187.32673: variable 'ansible_shell_executable' from source: unknown 11792 1727096187.32675: variable 'ansible_connection' from source: unknown 11792 1727096187.32677: variable 'ansible_module_compression' from source: unknown 11792 1727096187.32679: variable 'ansible_shell_type' from source: unknown 11792 1727096187.32681: variable 'ansible_shell_executable' from source: unknown 11792 1727096187.32682: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096187.32684: variable 'ansible_pipelining' from source: unknown 11792 1727096187.32686: variable 'ansible_timeout' from source: unknown 11792 1727096187.32688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096187.33025: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096187.33044: variable 'omit' from source: magic vars 11792 
1727096187.33053: starting attempt loop 11792 1727096187.33062: running the handler 11792 1727096187.33080: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096187.33109: _low_level_execute_command(): starting 11792 1727096187.33166: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096187.34585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.34717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.34848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.34869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.36560: stdout chunk (state=3): >>>/root <<< 11792 1727096187.36696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.36708: stdout chunk (state=3): >>><<< 11792 1727096187.36721: stderr chunk (state=3): >>><<< 11792 1727096187.36759: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.36783: _low_level_execute_command(): starting 11792 1727096187.36794: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080 `" && echo ansible-tmp-1727096187.3676865-14922-235506659438080="` echo /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080 `" ) && sleep 0' 11792 1727096187.37727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096187.37742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.37769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096187.37791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096187.37820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 11792 1727096187.37837: stderr chunk (state=3): >>>debug2: match not found <<< 11792 1727096187.37852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.37881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11792 1727096187.37960: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.37996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.38011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.38030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.38207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.40205: stdout chunk (state=3): >>>ansible-tmp-1727096187.3676865-14922-235506659438080=/root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080 <<< 11792 1727096187.40305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.40503: stderr chunk (state=3): >>><<< 11792 1727096187.40506: stdout chunk (state=3): >>><<< 11792 1727096187.40509: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096187.3676865-14922-235506659438080=/root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.40512: variable 'ansible_module_compression' from source: unknown 11792 1727096187.40514: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096187.40555: variable 'ansible_facts' from source: unknown 11792 1727096187.40649: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/AnsiballZ_command.py 11792 1727096187.40904: Sending initial data 11792 1727096187.40908: Sent initial data (156 bytes) 11792 1727096187.41934: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096187.41982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.42028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.42086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.42136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.42149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.42202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.43849: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096187.43904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11792 1727096187.43991: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpovc3vdj2 /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/AnsiballZ_command.py <<< 11792 1727096187.43995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpovc3vdj2" to remote "/root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/AnsiballZ_command.py" <<< 11792 1727096187.43998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/AnsiballZ_command.py" <<< 11792 1727096187.45112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.45119: stdout chunk (state=3): >>><<< 11792 1727096187.45121: stderr chunk (state=3): >>><<< 11792 1727096187.45127: done transferring module to remote 11792 1727096187.45130: _low_level_execute_command(): starting 11792 1727096187.45132: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/ /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/AnsiballZ_command.py && sleep 0' 11792 1727096187.45760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096187.45783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.45788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096187.45807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096187.45812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.45844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.45922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 11792 1727096187.45937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.45940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.45943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.45945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.46007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.47967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.47973: stdout chunk (state=3): >>><<< 11792 1727096187.47975: stderr chunk (state=3): >>><<< 11792 1727096187.48074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.48078: _low_level_execute_command(): starting 11792 1727096187.48080: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/AnsiballZ_command.py && sleep 0' 11792 1727096187.48595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.48600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096187.48611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.48680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.48764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.48830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.65483: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3420sec preferred_lft 3420sec\n inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute \n valid_lft forever 
preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:56:27.644184", "end": "2024-09-23 08:56:27.653356", "delta": "0:00:00.009172", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096187.67177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096187.67203: stderr chunk (state=3): >>><<< 11792 1727096187.67206: stdout chunk (state=3): >>><<< 11792 1727096187.67226: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3420sec preferred_lft 3420sec\n inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:56:27.644184", "end": "2024-09-23 08:56:27.653356", "delta": "0:00:00.009172", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096187.67262: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096187.67270: _low_level_execute_command(): starting 11792 1727096187.67275: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096187.3676865-14922-235506659438080/ > /dev/null 2>&1 && sleep 0' 11792 1727096187.67731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.67735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.67737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.67739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.67796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.67799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.67801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.67842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.69723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.69746: stderr chunk (state=3): >>><<< 11792 1727096187.69750: stdout chunk (state=3): >>><<< 11792 1727096187.69770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.69777: handler run complete 11792 1727096187.69797: Evaluated conditional (False): False 11792 1727096187.69806: attempt loop complete, returning result 11792 1727096187.69808: _execute() done 11792 1727096187.69811: dumping result to json 11792 1727096187.69816: done dumping result, returning 11792 1727096187.69824: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0afff68d-5257-d9c7-3fc0-000000000e5a] 11792 1727096187.69828: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e5a 11792 1727096187.69929: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e5a 11792 1727096187.69932: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009172", "end": "2024-09-23 08:56:27.653356", "rc": 0, "start": "2024-09-23 08:56:27.644184" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3420sec preferred_lft 3420sec inet6 
fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11792 1727096187.69999: no more pending results, returning what we have 11792 1727096187.70003: results queue empty 11792 1727096187.70004: checking for any_errors_fatal 11792 1727096187.70014: done checking for any_errors_fatal 11792 1727096187.70015: checking for max_fail_percentage 11792 1727096187.70016: done checking for max_fail_percentage 11792 1727096187.70017: checking to see if all hosts have failed and the running result is not ok 11792 1727096187.70018: done checking to see if all hosts have failed 11792 1727096187.70018: getting the remaining hosts for this loop 11792 1727096187.70020: done getting the remaining hosts for this loop 11792 1727096187.70023: getting the next task for host managed_node2 11792 1727096187.70031: done getting next task for host managed_node2 11792 1727096187.70033: ^ task is: TASK: Verify DNS and network connectivity 11792 1727096187.70037: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096187.70049: getting variables 11792 1727096187.70050: in VariableManager get_vars() 11792 1727096187.70099: Calling all_inventory to load vars for managed_node2 11792 1727096187.70102: Calling groups_inventory to load vars for managed_node2 11792 1727096187.70104: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096187.70114: Calling all_plugins_play to load vars for managed_node2 11792 1727096187.70117: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096187.70119: Calling groups_plugins_play to load vars for managed_node2 11792 1727096187.70928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096187.71790: done with get_vars() 11792 1727096187.71816: done getting variables 11792 1727096187.71862: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 08:56:27 -0400 (0:00:00.433) 0:01:09.998 ****** 11792 1727096187.71888: entering _queue_task() for managed_node2/shell 11792 1727096187.72160: worker is 1 (out of 1 available) 11792 1727096187.72173: exiting _queue_task() for managed_node2/shell 11792 1727096187.72187: done queuing things up, now waiting for results queue to drain 11792 1727096187.72189: waiting for pending results... 11792 1727096187.72379: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 11792 1727096187.72477: in run() - task 0afff68d-5257-d9c7-3fc0-000000000e5b 11792 1727096187.72491: variable 'ansible_search_path' from source: unknown 11792 1727096187.72494: variable 'ansible_search_path' from source: unknown 11792 1727096187.72529: calling self._execute() 11792 1727096187.72610: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096187.72616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096187.72627: variable 'omit' from source: magic vars 11792 1727096187.72911: variable 'ansible_distribution_major_version' from source: facts 11792 1727096187.72920: Evaluated conditional (ansible_distribution_major_version != '6'): True 11792 1727096187.73018: variable 'ansible_facts' from source: unknown 11792 1727096187.73625: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11792 1727096187.73628: variable 'omit' from source: magic vars 11792 1727096187.73674: variable 'omit' from source: magic vars 11792 1727096187.73688: variable 'omit' from source: magic vars 11792 1727096187.73732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11792 1727096187.73762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11792 1727096187.73780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11792 1727096187.73793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096187.73802: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11792 1727096187.73826: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11792 1727096187.73830: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096187.73834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096187.73905: Set connection var ansible_timeout to 10 11792 1727096187.73912: Set connection var ansible_module_compression to ZIP_DEFLATED 11792 1727096187.73919: Set connection var ansible_shell_executable to /bin/sh 11792 1727096187.73925: Set connection var ansible_pipelining to False 11792 1727096187.73928: Set connection var ansible_shell_type to sh 11792 1727096187.73930: Set connection var ansible_connection to ssh 11792 1727096187.73953: variable 'ansible_shell_executable' from source: unknown 11792 1727096187.73959: variable 'ansible_connection' from source: unknown 11792 1727096187.73961: variable 'ansible_module_compression' from source: unknown 11792 1727096187.73964: variable 'ansible_shell_type' from source: unknown 11792 1727096187.73966: variable 'ansible_shell_executable' from source: unknown 11792 1727096187.73970: variable 'ansible_host' from source: host vars for 'managed_node2' 11792 1727096187.73972: variable 'ansible_pipelining' from source: unknown 11792 1727096187.73974: variable 'ansible_timeout' from source: unknown 11792 1727096187.73976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11792 1727096187.74079: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096187.74088: variable 'omit' from source: magic vars 11792 1727096187.74093: starting attempt loop 11792 1727096187.74097: running the handler 11792 1727096187.74106: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11792 1727096187.74122: _low_level_execute_command(): starting 11792 1727096187.74129: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11792 1727096187.74652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096187.74657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 11792 1727096187.74662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.74714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.74717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.74719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.74768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.76479: stdout chunk (state=3): >>>/root <<< 11792 1727096187.76561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.76674: stderr chunk (state=3): >>><<< 11792 1727096187.76677: stdout chunk (state=3): >>><<< 11792 1727096187.76681: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.76683: _low_level_execute_command(): starting 11792 1727096187.76687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553 `" && echo ansible-tmp-1727096187.7662072-14951-184051211235553="` echo /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553 `" ) && sleep 0' 11792 1727096187.77109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.77122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.77127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.77129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.77171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.77175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.77187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.77230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.79307: stdout chunk (state=3): >>>ansible-tmp-1727096187.7662072-14951-184051211235553=/root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553 <<< 11792 1727096187.79385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.79419: stderr chunk (state=3): >>><<< 11792 1727096187.79422: stdout chunk (state=3): >>><<< 11792 1727096187.79440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096187.7662072-14951-184051211235553=/root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.79472: variable 'ansible_module_compression' from source: unknown 11792 1727096187.79522: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11792_gzkr3tr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11792 1727096187.79551: variable 'ansible_facts' from source: unknown 11792 1727096187.79610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/AnsiballZ_command.py 11792 1727096187.79717: Sending initial data 11792 1727096187.79721: Sent initial data (156 bytes) 11792 1727096187.80181: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096187.80185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 11792 1727096187.80187: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.80243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.80246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.80248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.80291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.82025: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11792 1727096187.82071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11792 1727096187.82110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpv_q4sk8h /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/AnsiballZ_command.py <<< 11792 1727096187.82113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/AnsiballZ_command.py" <<< 11792 1727096187.82173: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11792 1727096187.82177: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11792_gzkr3tr/tmpv_q4sk8h" to remote "/root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/AnsiballZ_command.py" <<< 11792 1727096187.82687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.82739: stderr chunk (state=3): >>><<< 11792 1727096187.82743: stdout chunk (state=3): >>><<< 11792 1727096187.82773: done transferring module to remote 11792 1727096187.82782: _low_level_execute_command(): starting 11792 1727096187.82787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/ /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/AnsiballZ_command.py && sleep 0' 11792 1727096187.83247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096187.83251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.83258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096187.83261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.83314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.83317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.83320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.83359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096187.85341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096187.85349: stdout chunk (state=3): >>><<< 11792 1727096187.85352: stderr chunk (state=3): >>><<< 11792 1727096187.85383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096187.85480: _low_level_execute_command(): starting 11792 1727096187.85484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/AnsiballZ_command.py && sleep 0' 11792 1727096187.86070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096187.86086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096187.86138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096187.86210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096187.86237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096187.86260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096187.86349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096188.22449: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1983 0 --:--:-- --:--:-- --:--:-- 1993\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 13963 0 --:--:-- --:--:-- --:--:-- 14550", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:56:28.024027", "end": "2024-09-23 08:56:28.222835", "delta": "0:00:00.198808", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11792 1727096188.24174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 11792 1727096188.24179: stdout chunk (state=3): >>><<< 11792 1727096188.24181: stderr chunk (state=3): >>><<< 11792 1727096188.24335: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1983 0 --:--:-- --:--:-- --:--:-- 1993\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 13963 0 --:--:-- --:--:-- --:--:-- 14550", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:56:28.024027", "end": "2024-09-23 08:56:28.222835", "delta": "0:00:00.198808", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 11792 1727096188.24346: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11792 1727096188.24349: _low_level_execute_command(): starting 11792 1727096188.24352: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096187.7662072-14951-184051211235553/ > /dev/null 2>&1 && sleep 0' 11792 1727096188.24937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11792 1727096188.24954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096188.24982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11792 1727096188.25015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11792 1727096188.25030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11792 1727096188.25084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11792 1727096188.25129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 11792 1727096188.25147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11792 1727096188.25171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11792 1727096188.25299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11792 1727096188.27207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11792 1727096188.27219: stdout chunk (state=3): >>><<< 11792 1727096188.27236: stderr chunk (state=3): >>><<< 11792 1727096188.27374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11792 1727096188.27377: handler run complete 11792 1727096188.27380: Evaluated conditional (False): False 11792 1727096188.27382: attempt loop complete, returning result 11792 1727096188.27384: _execute() done 11792 1727096188.27386: dumping result to json 11792 1727096188.27387: done dumping result, returning 11792 1727096188.27389: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0afff68d-5257-d9c7-3fc0-000000000e5b] 11792 1727096188.27391: sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e5b 11792 1727096188.27466: done sending task result for task 0afff68d-5257-d9c7-3fc0-000000000e5b 11792 1727096188.27472: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.198808", "end": "2024-09-23 08:56:28.222835", "rc": 0, "start": "2024-09-23 08:56:28.024027" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1983 0 --:--:-- --:--:-- --:--:-- 1993 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 13963 0 --:--:-- --:--:-- --:--:-- 14550 11792 1727096188.27637: no more pending results, returning what we have 11792 1727096188.27640: 
results queue empty 11792 1727096188.27641: checking for any_errors_fatal 11792 1727096188.27650: done checking for any_errors_fatal 11792 1727096188.27651: checking for max_fail_percentage 11792 1727096188.27653: done checking for max_fail_percentage 11792 1727096188.27654: checking to see if all hosts have failed and the running result is not ok 11792 1727096188.27655: done checking to see if all hosts have failed 11792 1727096188.27658: getting the remaining hosts for this loop 11792 1727096188.27665: done getting the remaining hosts for this loop 11792 1727096188.27671: getting the next task for host managed_node2 11792 1727096188.27685: done getting next task for host managed_node2 11792 1727096188.27688: ^ task is: TASK: meta (flush_handlers) 11792 1727096188.27690: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096188.27695: getting variables 11792 1727096188.27696: in VariableManager get_vars() 11792 1727096188.27738: Calling all_inventory to load vars for managed_node2 11792 1727096188.27741: Calling groups_inventory to load vars for managed_node2 11792 1727096188.27743: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096188.27753: Calling all_plugins_play to load vars for managed_node2 11792 1727096188.27755: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096188.27760: Calling groups_plugins_play to load vars for managed_node2 11792 1727096188.34608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096188.36138: done with get_vars() 11792 1727096188.36176: done getting variables 11792 1727096188.36239: in VariableManager get_vars() 11792 1727096188.36259: Calling all_inventory to load vars for managed_node2 11792 1727096188.36261: Calling groups_inventory to load vars for managed_node2 11792 1727096188.36263: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096188.36271: Calling all_plugins_play to load vars for managed_node2 11792 1727096188.36274: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096188.36277: Calling groups_plugins_play to load vars for managed_node2 11792 1727096188.37382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096188.39058: done with get_vars() 11792 1727096188.39088: done queuing things up, now waiting for results queue to drain 11792 1727096188.39091: results queue empty 11792 1727096188.39092: checking for any_errors_fatal 11792 1727096188.39096: done checking for any_errors_fatal 11792 1727096188.39097: checking for max_fail_percentage 11792 1727096188.39098: done checking for max_fail_percentage 11792 1727096188.39098: checking to see if all hosts have failed and the running result is not ok 11792 1727096188.39099: done checking to see if all hosts have failed 11792 1727096188.39100: getting the remaining hosts for this loop 11792 1727096188.39101: done getting the remaining hosts for this loop 11792 1727096188.39104: getting the next task for host managed_node2 11792 1727096188.39108: done getting next task for host managed_node2 11792 1727096188.39109: ^ task is: TASK: meta (flush_handlers) 11792 1727096188.39111: ^ state is: HOST STATE: block=7, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11792 1727096188.39113: getting variables 11792 1727096188.39114: in VariableManager get_vars() 11792 1727096188.39130: Calling all_inventory to load vars for managed_node2 11792 1727096188.39132: Calling groups_inventory to load vars for managed_node2 11792 1727096188.39134: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096188.39140: Calling all_plugins_play to load vars for managed_node2 11792 1727096188.39143: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096188.39145: Calling groups_plugins_play to load vars for managed_node2 11792 1727096188.40280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096188.41819: done with get_vars() 11792 1727096188.41850: done getting variables 11792 1727096188.41906: in VariableManager get_vars() 11792 1727096188.41925: Calling all_inventory to load vars for managed_node2 11792 1727096188.41927: Calling groups_inventory to load vars for managed_node2 11792 1727096188.41929: Calling all_plugins_inventory to load vars for managed_node2 11792 1727096188.41934: Calling all_plugins_play to load vars for managed_node2 11792 1727096188.41937: Calling groups_plugins_inventory to load vars for managed_node2 11792 1727096188.41940: Calling groups_plugins_play to load vars for managed_node2 11792 1727096188.43189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11792 1727096188.44763: done with get_vars() 11792 1727096188.44801: done queuing things up, now waiting for results queue to drain 11792 1727096188.44803: results queue empty 11792 1727096188.44804: checking for any_errors_fatal 11792 1727096188.44806: done checking for any_errors_fatal 11792 1727096188.44806: checking for max_fail_percentage 11792 1727096188.44807: done checking for max_fail_percentage 11792 1727096188.44808: checking to see if all hosts have failed and the running result is not ok 11792 1727096188.44809: done checking to see if all hosts have failed 11792 1727096188.44810: getting the remaining hosts for this loop 11792 1727096188.44811: done getting the remaining hosts for this loop 11792 1727096188.44814: getting the next task for host managed_node2 11792 1727096188.44817: done getting next task for host managed_node2 11792 1727096188.44818: ^ task is: None 11792 1727096188.44819: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11792 1727096188.44820: done queuing things up, now waiting for results queue to drain 11792 1727096188.44821: results queue empty 11792 1727096188.44822: checking for any_errors_fatal 11792 1727096188.44823: done checking for any_errors_fatal 11792 1727096188.44823: checking for max_fail_percentage 11792 1727096188.44824: done checking for max_fail_percentage 11792 1727096188.44825: checking to see if all hosts have failed and the running result is not ok 11792 1727096188.44825: done checking to see if all hosts have failed 11792 1727096188.44827: getting the next task for host managed_node2 11792 1727096188.44829: done getting next task for host managed_node2 11792 1727096188.44830: ^ task is: None 11792 1727096188.44831: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False PLAY RECAP ********************************************************************* managed_node2 : ok=148 changed=5 unreachable=0 failed=0 skipped=97 rescued=0 ignored=0 Monday 23 September 2024 08:56:28 -0400 (0:00:00.730) 0:01:10.728 ****** =============================================================================== ** TEST check bond settings --------------------------------------------- 5.91s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 ** TEST check bond settings --------------------------------------------- 2.91s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 fedora.linux_system_roles.network : Check which services are running ---- 2.18s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Install dnsmasq --------------------------------------------------------- 2.18s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 fedora.linux_system_roles.network : Check which services are running ---- 2.08s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.96s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.96s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 1.87s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6 Create test interfaces -------------------------------------------------- 1.81s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Create test interfaces -------------------------------------------------- 1.58s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.27s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Gathering Facts 
--------------------------------------------------------- 1.18s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 fedora.linux_system_roles.network : Check which packages are installed --- 1.06s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.03s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Configure networking connection profiles --- 0.99s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Configure networking connection profiles --- 0.96s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.90s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 fedora.linux_system_roles.network : Configure networking connection profiles --- 0.90s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which packages are installed --- 0.86s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.85s /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 11792 1727096188.44953: RUNNING CLEANUP
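Editor's note: for readability, the shell script run by the "Verify DNS and network connectivity" task above is reproduced here. It is reconstructed directly from the cmd field of the task result (the embedded \n escapes expanded into real lines); the comments are added annotations, not part of the original script.

    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # Resolve the host via the system resolver; a lookup failure fails the task.
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # Confirm HTTPS reachability; the response body is discarded, only the exit status matters.
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

As the STDOUT in the task result shows, both hosts resolved (IPv6 addresses for wildcard.fedoraproject.org / mirrors.centos.org) and both curl requests returned rc=0, so the connectivity check passed.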